Cloudera Data Platform (CDP) Jobs in USA

12,008 positions found

Data Platform Operator
✦ New
🏢 QUAD
$19 - $25 / hour
Duncan, SC 1 day ago

Unleash Your Potential at Quad – Don't Miss Out!

Ready to supercharge your career and make a lasting impact? At Quad, we're excited to welcome ambitious individuals who are driven to excel. Are you mechanically inclined or maintenance-savvy? Ready to take on a new challenge? Look no further! We're looking for motivated, detail-oriented individuals to join our vibrant team in Spartanburg, SC. Your adventure to success begins now – grab this opportunity!

Our 82,000 sq. ft. facility in Spartanburg, SC, is a state-of-the-art packaging plant that serves a diverse range of clients, including medical, pharmaceutical, and well-known liquor and tobacco brands. We offer sheetfed offset and narrow-web flexo printing, along with die cutting and custom folding/gluing, all supported by advanced inline quality control systems.

The facility is well-lit with both natural and artificial lighting, climate-controlled, and impeccably clean. We take great pride in fostering a friendly, team-oriented atmosphere where everyone collaborates to achieve our goals!

Headquartered in Wisconsin, Quad is a marketing experience company that helps brands make direct consumer connections, from household to in-store to online. The company does this through its MX Solutions Suite, a comprehensive range of marketing and print services that seamlessly integrate creative, production, and media solutions across online and offline channels. Supported by state-of-the-art technology and data-driven intelligence, Quad simplifies the complexities of marketing by removing friction wherever it occurs along the marketing journey. With approximately 11,000 employees in 11 countries, we serve around 2,100 clients, including industry-leading blue-chip companies that serve both businesses and consumers in multiple industry verticals, with a particular focus on commerce, including retail, consumer packaged goods, and direct-to-consumer; financial services; and health. Quad is ranked among the largest agency companies in the U.S. by Ad Age, buoyed by its full-service media agency, Rise, and creative agency, Betty. Quad is also one of the largest commercial printers in North America, according to Printing Impressions.

Quad is seeking a Flexo Press Operator for our Spartanburg, SC location. We are looking for operators who are flexible and can work the night shift:

4 pm – 2 am (Mon-Thurs)

Wages start between $19.00 and $25.00 per hour, or more based on relevant work experience and a strong employment history.


Essential Functions of this position include:

  • Prepare for Operation - Access job ticket information and set up a flexographic printing press to produce labels and other products to customer specifications. Ensure the machine is adequately stocked with the correct raw materials for each job. 
  • Operate Flexographic Press - Operate assigned equipment in accordance with company safety standards and departmental SOPs to produce printed products according to customer specifications. Continually monitor supply levels of inks, paper, and other required materials to add as needed. Make routine adjustments as needed to maintain print quality and correct any issues as soon as possible.
  • Perform Troubleshooting & Maintenance - Observe and monitor machine operations to determine whether adjustments are needed. Perform basic maintenance and advanced troubleshooting of assigned equipment during shift.
  • Perform Quality Checks - Complete quality checklist(s) and other required documentation. Perform visual quality checks of the product throughout the printing process to ensure customer satisfaction. Flag bad product for removal from job run. Cut samples from each job and compare them to product standards to ensure compliance with customer specifications.
  • Perform Line Clearance - Clean assigned area by removing all products from the line, trash, boxes, and other supplies associated with a completed order.  

Required Knowledge, Skills, and Abilities include:

  • Knowledge of the setup and operation of a flexographic printing press; Mark Andy P5 experience preferred, but experience with other models will also be considered.
  • Mechanical aptitude and skills to perform troubleshooting and maintenance.
  • Attention to detail and accuracy.
  • Excellent communication skills.
  • Ability to analyze problems for root causes and determine solutions.
  • Ability to match and detect differences in similar color shades and hues.
  • Ability to understand, remember, and apply/follow written and verbal instructions.
  • Ability to understand, remember, and communicate routine, factual information.
  • Ability to complete routine, existing forms.
  • Ability to organize one's schedule and tasks for efficient workflow and production.
  • Ability to perform tasks with room for personal interpretation; problem-solving involves a supervisor when needed.
  • Ability to count accurately.
  • Ability to add, subtract, multiply, and divide numerical data.
  • Ability to use measuring equipment to determine substrate sizes, etc. 

Working Conditions include:

  • Requires work with moving mechanical parts.
  • Requires work in a noisy, fast-paced environment where forklifts and other machinery are used.
  • Requires work at risk of electrical shock.

Additional Information

The actual rate of pay offered will vary based upon factors including, but not limited to: education, skills, experience, proficiency, performance, shift, and location. In addition to base salary, depending on the role, the total compensation package may also include overtime and shift differentials. Quad offers benefits including medical, dental, and vision coverage, paid time off, disability insurance, annual discretionary match to 401(k) based on company performance, life insurance, and other voluntary supplemental insurance coverages, plus childbirth short-term disability insurance, paid parental leave, adoption & surrogacy benefits, pet insurance, and more!

If you're ready to take the next step in your career with Quad, apply today and become part of a team that values growth, innovation, and your potential to excel.

Not Specified
Sr. Digital Product Manager (Membership, Customer Data & Loyalty)
✦ New
🏢 Petco
Salary not disclosed
Want to help pets live their best lives?
We’re proud to be where the pets go and where the pet people go. If you want to make a real difference, create an exciting career path, feel welcome to be your whole self and nurture your wellbeing, Petco is the place for you.
Our core values capture that spirit as we work to improve lives by doing what’s right for pets and people.
  • Pet First – Protect & Empower. All pets should Live their Best Life. We put the needs of pets and pet parents at the center of everything we do.
  • Foster the Fun – Connect & Bond. Our passion for pets brings us together! We celebrate the journey of pet parenthood through distinct experiences, products, and services.
  • Let’s Go! – Own & Commit. We are stronger as One Petco team. We bring our unique superpowers and champion authenticity in everyone to drive success.
About Petco
We’re proud to be "where the pets go" to find everything they need to live their best lives for more than 60 years — from their favorite meals and toys, to trusted supplies and expert support from people who get it, because we live it. We believe in the universal truths of pet parenthood — the boundless boops, missing slippers, late night zoomies and everything in between. And we’re here for it. Every tail wag, every vet visit, every step of the way. We are 29,000+ strong and together we nurture the pet-human bond in more than 1,500 Petco stores across the U.S., Mexico and Puerto Rico, 250+ Vetco Total Care hospitals, hundreds of preventive care clinics and eight distribution centers. In 1999, we founded Petco Love. Together, we support thousands of local animal welfare groups nationwide and have helped find homes for approximately 7 million animals through in-store adoption events.
Membership, Customer Data & Loyalty
Position Overview
The Senior Digital Product Manager will lead digital product initiatives supporting Membership, Customer Data, and Loyalty programs for a $6B specialty retail organization. This role will own the end-to-end product strategy and roadmap for customer identity, data platforms, and loyalty experiences across digital and in-store channels.
The ideal candidate brings deep expertise in customer data platforms (CDPs), identity resolution, loyalty ecosystems, personalization, and privacy governance, combined with strong business acumen and cross-functional leadership skills.
Key Responsibilities
Product Strategy & Vision
  • Define and execute the multi-year product strategy for Membership, Customer Data, and Loyalty platforms.
  • Develop and maintain a prioritized product roadmap aligned with enterprise growth, retention, and customer lifetime value (CLV) objectives.
  • Identify opportunities to leverage customer data to drive personalization, engagement, and revenue growth.
Customer Data & Platform Leadership
  • Lead development and optimization of customer data capabilities, including:
    • Identity resolution and profile unification
    • Data governance and compliance (GDPR, CCPA, etc.)
    • Segmentation and audience management
    • Real-time personalization enablement
  • Partner with Engineering and Data teams to evolve CDP, CRM, and marketing technology stacks.
  • Ensure scalable architecture to support omnichannel retail environments.
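The identity resolution and profile unification work listed above can be sketched in miniature. The record fields, the email-based merge key, and the "latest value wins" rule below are illustrative assumptions, not Petco's actual design:

```python
from collections import defaultdict

def unify_profiles(records):
    """Merge customer records that share a deterministic key (email).

    Each record is a dict; records with the same lowercased email are
    collapsed into one unified profile, keeping the most recent non-empty
    value for every field (records are assumed ordered oldest to newest).
    """
    profiles = defaultdict(dict)
    for rec in records:
        key = rec["email"].strip().lower()  # deterministic match key
        profiles[key].update({k: v for k, v in rec.items() if v})
    return dict(profiles)

# Three raw records from two touchpoints collapse into two profiles.
raw = [
    {"email": "ann@example.com", "name": "Ann", "phone": ""},
    {"email": "ANN@example.com", "name": "Ann Lee", "phone": "555-0100"},
    {"email": "bob@example.com", "name": "Bob", "phone": "555-0199"},
]
unified = unify_profiles(raw)
print(len(unified))                # 2
print(unified["ann@example.com"])  # merged name and phone for Ann
```

A production CDP would add survivorship rules, match confidence scores, and streaming updates; the point here is only the deterministic key-based merge.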
Membership & Loyalty Programs
  • Own digital product capabilities supporting loyalty enrollment, rewards management, tiering, promotions, and engagement campaigns.
  • Optimize customer lifecycle journeys from acquisition through retention.
  • Develop features that enhance member value proposition and drive repeat purchase behavior.
  • Measure and improve loyalty program ROI, retention rate, and lifetime value.
Cross-Functional Leadership
  • Lead agile product teams and collaborate closely with:
    • Engineering
    • Data Science & Analytics
    • Marketing & CRM
    • eCommerce
    • Store Operations
    • Finance & Legal
  • Serve as the voice of the customer and translate business objectives into clear product requirements.
  • Align stakeholders around KPIs and measurable outcomes.
Analytics & Performance
  • Define success metrics and KPIs (CLV, retention, engagement, incremental revenue, NPS).
  • Use data and experimentation (A/B testing, cohort analysis) to drive product decisions.
  • Build executive-level reporting and business cases for investment prioritization.
Required Qualifications
  • 5+ years of product management experience, with 3+ years in digital product leadership.
  • Deep expertise in customer data management, CDPs, CRM systems, and loyalty platforms.
  • Experience in retail, specialty retail, consumer brands, or omnichannel environments.
  • Proven track record of delivering data-driven personalization initiatives.
  • Strong understanding of privacy regulations and data governance frameworks.
  • Experience leading agile product teams and influencing cross-functional stakeholders.
  • Demonstrated ability to manage complex platform integrations and enterprise-scale systems.
Preferred Qualifications
  • Experience working in a multi-billion-dollar retail organization.
  • Background in subscription or membership-based business models.
  • Familiarity with leading CDP and CRM ecosystems (e.g., Salesforce, Adobe, Tealium, etc.).
  • MBA or advanced degree in business, technology, or related field.
Leadership Competencies
  • Strategic thinker with strong commercial acumen
  • Data-driven decision maker
  • Influential communicator with executive presence
  • Customer-obsessed mindset
  • Bias for action and measurable impact
  • Ability to operate in fast-paced, matrixed organizations
Impact of the Role
This role directly influences customer retention, personalization maturity, and revenue growth by shaping how the organization leverages its customer data assets. The Senior Digital Product Manager will play a critical role in strengthening membership value, loyalty engagement, and long-term customer relationships.
#CORP
Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act.
The pay ranges outlined below are presented in accordance with state-specific regulations. These ranges may differ in other areas and could be subject to variation based on regulatory minimum wage requirements. Actual pay rates will depend on factors such as position, location, level of experience, and applicable state or local minimum wage laws. If the regulatory minimum wage exceeds the minimum indicated in the pay range below, the regulatory minimum wage will be the minimum rate applied.
Salary Range: $103,800.00 - $155,700.00
Hourly or Salary Range will be reflected above. For a more detailed overview of Petco Total Rewards, including health and financial benefits, 401K, incentives, and PTO, see the Petco Total Rewards overview. Petco Animal Supplies, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, protected veteran status, or any other protected classification.
Not Specified
Data Architect - Consumer Platform
✦ New
Salary not disclosed

The pay range for this role is $150,000 - $200,000/yr USD.


WHO WE ARE:


Headquartered in Southern California, Skechers—the Comfort Technology Company®—has spent over 30 years helping men, women, and kids everywhere look and feel good. Comfort innovation is at the core of everything we do, driving the development of stylish, high-quality products at a great value. From our diverse footwear collections to our expanding range of apparel and accessories, Skechers is a complete lifestyle brand.


ABOUT THE ROLE:


Skechers Digital Team is seeking a Digital Data Architect reporting to the Director, Digital Architecture, Consumer Domain. This role is responsible for designing and governing Skechers’ Consumer Data 360 ecosystem, enabling identity resolution, high-quality data foundations, personalization, loyalty intelligence, and machine learning capabilities across digital and retail channels.


The ideal candidate will be a strong technical leader, have hands-on full-stack technical knowledge in enterprise technologies related to Skechers’ consumer domain, and have the ability to work in a fast-paced agile environment. You should have knowledge of consumer programs from an architecture/industry perspective, and strong hands-on experience designing solutions on the Salesforce Core Platform (including configuration, integration, and data model best practices).


You will work cross-functionally with Digital Engineering, Data Engineering, Data Science, Loyalty, and Marketing teams to architect scalable, secure, and high-performance data platforms that support advanced personalization and recommender systems.


WHAT YOU’LL DO:


  • Responsible for the full technical life cycle of consumer platform capabilities, which includes:
    • Capability roadmap and technical architecture in alignment with the consumer experience
    • Technical planning, design, and execution
    • Operations, analytics/reporting, and adoption
  • Define and evolve Skechers’ Consumer Data 360 architecture, including identity resolution (deterministic and probabilistic matching) and unified customer profiles.
  • Architect scalable data models and pipelines across CDP, CRM, e-commerce, marketing automation, data lake, and warehouse platforms.
  • Establish enterprise data quality frameworks including validation, deduplication, anomaly detection, and observability.
  • Optimize SQL workloads and large-scale distributed queries through performance tuning, partitioning, indexing, and workload management strategies.
  • Design and oversee ML pipelines supporting personalization, churn modeling, and recommender systems.
  • Partner with Data Science teams to productionize models using distributed platforms such as Databricks (Spark, Delta Lake, MLflow preferred).
  • Ensure secure data governance, access control (RBAC/ABAC), and compliance with GDPR, CCPA, and related privacy regulations.
  • Provide architectural oversight ensuring performance, scalability, resilience, and maintainability.
  • Collaborate with stakeholders to translate business objectives (LTV growth, personalization lift, engagement) into scalable data solutions.
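The deterministic and probabilistic matching mentioned above can be sketched as follows. The field names, weights, and threshold are hypothetical, not Skechers' implementation; real probabilistic resolution typically uses weighted multi-field models (e.g. Fellegi-Sunter) rather than simple string similarity:

```python
from difflib import SequenceMatcher

def deterministic_match(a, b):
    """Exact match on a shared stable identifier (here, email)."""
    return a["email"].lower() == b["email"].lower()

def probabilistic_match(a, b, threshold=0.8):
    """Fuzzy match on name plus ZIP when no shared identifier exists.

    Scores name similarity with difflib and blends in an exact ZIP
    comparison; weights and threshold are illustrative only.
    """
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    zip_match = 1.0 if a["zip"] == b["zip"] else 0.0
    score = 0.7 * name_sim + 0.3 * zip_match
    return score >= threshold

web = {"email": "j.smith@example.com", "name": "Jon Smith", "zip": "90210"}
pos = {"email": "jsmith@shop.example", "name": "Jonathan Smith", "zip": "90210"}

print(deterministic_match(web, pos))  # False: the emails differ
print(probabilistic_match(web, pos))  # True: similar name + same ZIP clears the threshold
```

A unified profile store would link the two records under one customer ID once either rule fires, which is the "unified customer profiles" outcome the role describes.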


REQUIREMENTS:


  • Computer Science, Data Engineering, or related degree or equivalent experience.
  • 12+ years of experience architecting enterprise data platforms in cloud environments.
  • 9+ years of data engineering experience with a focus on consumer data.
  • 6+ years of experience working with Salesforce platforms, including data models and enterprise integrations.
  • Strong experience with Data 360 and identity resolution architectures.
  • Proven expertise in SQL performance tuning and large-scale data modeling.
  • Hands-on experience implementing ML pipelines and recommender systems in production environments.
  • Experience with cloud technologies (AWS, GCP, or Azure).
  • Experience with integration patterns (API, ETL, event streaming).
  • Experience providing technical leadership and guidance across multiple projects and development teams.
  • Experience translating business requirements into detailed technical specifications and working with development teams through implementation, including issue resolution and stakeholder communication.
  • Strong project management skills including scope assessment, estimation, and clear technical communication with both business users and technical teams.
  • Must hold at least one of the following Salesforce certifications: Platform App Builder, Platform Developer I, or JavaScript Developer I.
  • Experience with Databricks or similar distributed data/ML platforms preferred.
Not Specified
Databricks Architect/ Senior Data Engineer
✦ New
🏢 OZ
Salary not disclosed
Boca Raton, FL 1 day ago

OZ – Databricks Architect/ Senior Data Engineer


Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.


We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!


What We're Looking For:

We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.


This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.


Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.


Position Overview:

The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.


This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.


Key Responsibilities:

  • Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
  • Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
  • DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
  • Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
  • Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
  • GenAI Application Development: Experience building GenAI applications is a strong plus.
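As a rough sketch of the Medallion Architecture this role would define (bronze raw, silver cleaned, gold aggregated), plain Python stands in for the PySpark/Delta Lake code a real Databricks pipeline would use; the row shapes and cleaning rules are invented for illustration:

```python
# Minimal medallion-style pipeline: bronze (raw) -> silver (cleaned)
# -> gold (aggregated). In Databricks these layers would be Delta
# tables transformed with PySpark; dicts stand in here.

bronze = [  # raw ingested events, duplicates and bad rows included
    {"order_id": 1, "customer": "ann", "amount": "25.00"},
    {"order_id": 1, "customer": "ann", "amount": "25.00"},  # duplicate
    {"order_id": 2, "customer": "bob", "amount": "not-a-number"},
    {"order_id": 3, "customer": "ann", "amount": "10.50"},
]

def to_silver(rows):
    """Deduplicate on order_id and enforce a numeric amount."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine malformed rows
        if r["order_id"] not in seen:
            seen.add(r["order_id"])
            out.append({**r, "amount": amount})
    return out

def to_gold(rows):
    """Aggregate cleaned orders into per-customer revenue."""
    totals = {}
    for r in rows:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'ann': 35.5}
```

Each layer stays queryable on its own, which is the property that makes the bronze/silver/gold split useful for both debugging ingestion and serving BI from gold.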


Requirements:

  • 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
  • Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
  • Strong programming skills in Python and SQL; experience with PySpark required.
  • Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
  • Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
  • Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
  • Strong understanding of data architecture, data modeling, and performance optimization.
  • Experience working with cross-functional teams to deliver enterprise data solutions.
  • Ability to tackle complex data challenges while ensuring data quality and reliable delivery.


Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience designing enterprise-scale data platforms and modern data architectures.
  • Experience with data integration tools such as Azure Data Factory or similar platforms.
  • Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
  • Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
  • Databricks, Azure, or cloud certifications are preferred.
  • Strong problem-solving, communication, and technical leadership skills.


Technical Proficiency in:

  • Databricks, Apache Spark, PySpark, Delta Lake
  • Python, SQL, Scala (preferred)
  • Cloud platforms: Azure (preferred), AWS, or GCP
  • Azure Data Factory, Kafka, and modern data integration tools
  • Data warehousing: Databricks, Snowflake, or Microsoft Fabric
  • DevOps tools: Git, Azure DevOps, CI/CD pipelines
  • Data architecture, ETL/ELT design, and performance optimization


What You’re Looking For:

Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.


About Us:

OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.


OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.

Not Specified
Data Architect - Power & Utilities - Senior Manager - Consulting - Location OPEN
$250+
San Francisco, CA 2 days ago

Location: Anywhere in Country


At EY, we’re all in to shape your future with confidence.


We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.


AI & Data - Data Architecture – Senior Manager – Power & Utilities Sector

EY is seeking a motivated professional with solid experience in the utilities sector to serve as a Senior Manager. The ideal candidate possesses a robust background in Data Architecture, Data Modernization, end-to-end data capabilities, AI, Gen AI, and Agentic AI, preferably with a power systems / electrical engineering background and a record of delivering business use cases in Transmission, Distribution, Generation, or Customer. Candidates should have a history of working for consulting companies and be well-versed in the fast-paced culture of consulting work. This role is dedicated to the utilities sector, where the successful candidate will craft, deploy, and maintain large-scale, AI-ready data architectures.


The opportunity

You will help our clients enable better business outcomes while working in the rapidly growing Power & Utilities sector. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high growth area and a high visibility role with plenty of opportunities to enhance your skillset and build your career.


As a Senior Manager in Data Architecture, you will have the opportunity to lead transformative technology projects and programs that align with our organizational strategy to achieve impactful outcomes. You will provide assurance to leadership by managing timelines, costs, and quality, and lead both technical and non-technical project teams in the development and implementation of cutting-edge technology solutions and infrastructure. You will have the opportunity to be face to face with external clients and build new and existing relationships in the sector. Your specialized knowledge in project and program delivery methods, including Agile and Waterfall, will be instrumental in coaching others and proposing solutions to technical constraints.


Your key responsibilities

In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions. Your responsibilities will include:


As Data Architect – Senior Manager, you will have an expert understanding of data architecture and data engineering and will be focused on problem-solving to design, architect, and present findings and solutions, leading more junior team members, and working with a wide variety of clients to sell and lead delivery of technology consulting services. You will be the go-to resource for understanding our clients’ problems and responding with appropriate methodologies and solutions anchored around data architectures, platforms, and technologies. You are responsible for helping to win new business for EY. You are a trusted advisor with a broad understanding of digital transformation initiatives, the analytic technology landscape, industry trends and client motivations. You are also a charismatic communicator and thought leader, capable of going toe-to-toe with the C-level in our clients and prospects and willing and able to constructively challenge them.


Skills and attributes for success

To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:



  • Technical Skills: Applications Integration
  • Cloud Computing and Cloud Computing Architecture
  • Data Architecture Design and Modelling
  • Data Integration and Data Quality
  • AI/Agentic AI driven data operations
  • Experience delivering business use cases in Transmission / Distribution / Generation / Customer.
  • Strong relationship management and business development skills.
  • Become a trusted advisor to your clients’ senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems.
  • Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements.
  • Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients’ objectives.
  • Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, key business processes, brainstorm solutions, and align on data architecture strategies and projects.
  • Lead the design and implementation of modern data architectures, supporting transactional, operational, analytical, and AI solutions.
  • Direct and mentor global data architecture and engineering teams, fostering a culture of innovation, collaboration, and continuous improvement.
  • Establish data governance policies and practices, including data security, quality, and lifecycle management.
  • Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities.

To qualify for the role, you must have

  • A Bachelor’s degree in a STEM field is required.
  • 12+ years professional consulting experience in industry or in technology consulting.
  • 12+ years hands-on experience in architecting, designing, delivering or optimizing data lake solutions.
  • 5+ years’ experience with native cloud products and services such as Azure or GCP.
  • 8+ years of experience mentoring and leading teams of data architects and data engineers, fostering a culture of innovation and professional development.
  • In-depth knowledge of data architecture principles and best practices, including data modelling, data warehousing, data lakes, and data integration.
  • Demonstrated experience in leading large data engineering teams to design and build platforms with complex architectures and diverse features including various data flow patterns, relational and no-SQL databases, production-grade performance, and delivery to downstream use cases and applications.
  • Hands-on experience in designing end-to-end architectures and pipelines that collect, process, and deliver data to its destination efficiently and reliably.
  • Proficiency in data modeling techniques and the ability to choose appropriate architectural design patterns, including Data Fabrics, Data Mesh, Lake Houses, or Delta Lakes.
  • Experience managing complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimization.
  • Hands-on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala.
  • Ability to design data solutions that can scale horizontally and vertically while optimizing performance.
  • Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads.
  • Experience in version control systems (e.g. Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps).
  • Practical understanding of data encryption, access control, and security best practices to protect sensitive data.
  • Experience leading Infrastructure and Security engineers and architects in overall platform build.
  • Excellent leadership, communication, and project management skills.
  • Data Security and Database Management
  • Enterprise Data Management and Metadata Management
  • Ontology Design and Systems Design
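The horizontal-scaling requirement above can be illustrated with a toy hash-partitioning sketch: records are assigned to partitions by a stable hash of their key, so adding workers only requires redistributing partitions. The partition count and key names below are illustrative, not from any specific platform.

```python
import hashlib

NUM_PARTITIONS = 4  # illustrative partition count

def partition_for(key: str) -> int:
    """Stable assignment of a record key to one of N partitions."""
    digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_PARTITIONS

# Each key always maps to the same partition, which lets independent
# workers each own a disjoint slice of the keyspace.
keys = ["customer-1001", "customer-1002", "customer-1003"]
print({k: partition_for(k) for k in keys})
```

Because the mapping is deterministic, any worker can route a record without coordination; production systems typically refine this with consistent hashing to limit data movement when the partition count changes.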

Ideally, you’ll also have

  • Master’s degree in Electrical/Power Systems Engineering, Computer Science, Statistics, Applied Mathematics, Data Science, or Machine Learning, or commensurate professional experience.
  • Experience working at a Big 4 firm or a major utility.
  • Experience with cloud data platforms like Databricks.
  • Experience in leading and influencing teams, with a focus on mentorship and professional development.
  • A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
  • The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
  • Building and Managing Relationships
  • Client Trust and Value and Commercial Astuteness
  • Communicating With Impact and Digital Fluency

What we look for

We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role.


FY26NATAID


What we offer you

At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.



  • We offer a comprehensive compensation and benefits package where you’ll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
  • Join us in our team‑led and leader‑enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
  • Under our flexible vacation policy, you’ll decide how much vacation time you need based on your own personal circumstances. You’ll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well‑being.

Are you ready to shape your future with confidence? Apply today.

EY accepts applications for this position on an on‑going basis.


For those living in California, please click here for additional information.


EY focuses on high‑ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.


EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.


Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.


EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.


EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.


EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY’s Talent Shared Services Team (TSS) or email the TSS at .


Sr. Data Engineer (Hybrid)
Salary not disclosed
Chicago, IL, Hybrid 2 days ago

Sr. Data Engineer (Hybrid)

Chicago, IL

The American Medical Association (AMA) is the nation's largest professional association of physicians and a non-profit organization. We are a unifying voice and powerful ally for America's physicians, the patients they care for, and the promise of a healthier nation. To be part of the AMA is to be part of our mission to promote the art and science of medicine and the betterment of public health.

At AMA, our mission to improve the health of the nation starts with our people. We foster an inclusive, people-first culture where every employee is empowered to perform at their best. Together, we advance meaningful change in health care and the communities we serve.

We encourage and support professional development for our employees, and we are dedicated to social responsibility. We invite you to learn more about us and we look forward to getting to know you.

We have an opportunity at our corporate offices in Chicago for a Sr. Data Engineer (Hybrid) on our Information Technology team. This is a hybrid position reporting into our Chicago, IL office, requiring 3 days a week in the office.

As a Sr. Data Engineer, you will play a key role in implementing and maintaining AMA's enterprise data platform to support analytics, interoperability, and responsible AI adoption. This role partners closely with platform engineering, data governance, data science, IT security, and business stakeholders to deliver high-quality, reliable, and secure data products. This role contributes to AMA's modern lakehouse architecture, optimizing data operations and embedding governance and quality standards into engineering workflows. This role serves as a senior technical contributor within the team, providing mentorship to junior engineers and implementing engineering best practices within the data platform function, in alignment with architectural direction set by leadership.

RESPONSIBILITIES:

Data Engineering & AI Enablement

  • Build and maintain scalable data pipelines and ETL/ELT workflows supporting analytics, operational reporting, and AI/ML use cases.
  • Implement best-practice patterns for ingestion, transformation, modeling, and orchestration within a modern lakehouse environment (e.g., Databricks, Delta Lake, Azure Data Lake).
  • Develop high-performance data models and curated datasets with strong attention to quality, usability, and interoperability; create reusable engineering components and automation.
  • Collaborate with the Architecture Team, the Data Platform Lead, and federated IT teams to optimize storage, compute, and architectural patterns for performance and cost-efficiency.
  • Build model-ready datasets and feature pipelines to support AI/ML use cases; serve as a technical coordination point supporting business units' AI-related infrastructure needs.
  • Collaborate with data scientists and the AI Working Group to operationalize models responsibly and maintain ongoing monitoring signals.
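The pipeline-with-embedded-quality-controls pattern described above can be sketched in plain Python: a transform step that normalizes raw records and applies a quality gate before records enter a curated dataset. All names here (`clean_records`, `REQUIRED_FIELDS`, the sample fields) are hypothetical illustrations, not AMA's actual code or schema.

```python
# Hypothetical quality gate: fields every curated record must carry.
REQUIRED_FIELDS = {"record_id", "specialty", "state"}

def clean_records(rows):
    """Normalize raw records and drop those failing basic quality checks."""
    curated = []
    for row in rows:
        if not REQUIRED_FIELDS.issubset(row):
            continue  # quality gate: skip incomplete records
        curated.append({
            "record_id": str(row["record_id"]).strip(),
            "specialty": row["specialty"].strip().title(),
            "state": row["state"].strip().upper(),
        })
    return curated

raw = [
    {"record_id": 1, "specialty": "cardiology", "state": "il"},
    {"specialty": "oncology", "state": "ny"},  # missing record_id, dropped
]
print(clean_records(raw))
```

In a real lakehouse this logic would typically live in a Spark or Databricks job writing Delta tables, with the rejected records routed to a quarantine table for stewardship review rather than silently dropped.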

Governance, Quality & Compliance

  • Embed data governance, metadata standards, lineage tracking, and quality controls directly into engineering workflows, and ensure their technical implementation and alignment.
  • Work with the Data Governance Lead and business stakeholders to operationalize stewardship, classification, validation, retention, and access standards.
  • Implement privacy-by-design and security-by-design principles, ensuring compliance with internal policies and regulatory obligations.
  • Maintain documentation for pipelines, datasets, and transformations to support transparency and audit requirements.

Platform Reliability, Observability & Optimization

  • Monitor and troubleshoot pipeline failures, performance bottlenecks, data anomalies, and platform-level issues.
  • Implement observability tooling, alerts, logging, and dashboards to ensure end-to-end reliability.
  • Support cost governance by optimizing compute resources, refining job schedules, and advising on efficient architecture.
  • Collaborate with the Data Platform Lead on scaling, configuration management, CI/CD pipelines, and environment management.
  • Collaborate with business units to understand data needs, translate them into engineering requirements, and deliver fit-for-purpose data solutions; share and apply best practices and emerging technologies within assigned initiatives.
  • Work with IT Security and Legal/Compliance to ensure platform and datasets meet risk and regulatory standards.

Staff Management

  • Lead, mentor, and provide management oversight for staff.
  • Set objectives, evaluate employee performance, and foster a collaborative team environment.
  • Develop staff knowledge and skills to support career development.

May include other responsibilities as assigned

REQUIREMENTS:

  1. Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field preferred; equivalent work experience with an HS diploma/equivalent education required.
  2. 5+ years of experience in data engineering within cloud environments.
  3. Experience in people management preferred.
  4. Demonstrated hands-on experience with modern data platforms (Databricks preferred).
  5. Proficiency in Python, SQL, and data transformation frameworks.
  6. Experience designing and operationalizing ETL/ELT pipelines, orchestration workflows (Airflow, Databricks Workflows), and CI/CD processes.
  7. Solid understanding of data modeling, structured/unstructured data patterns, and schema design.
  8. Experience implementing governance and quality controls: metadata, lineage, validation, stewardship workflows.
  9. Working knowledge of cloud architecture, IAM, networking, and security best practices.
  10. Demonstrated ability to collaborate across technical and business teams.
  11. Exposure to AI/ML engineering concepts, feature stores, model monitoring, or MLOps patterns.
  12. Experience with infrastructure-as-code (Terraform, CloudFormation) or DevOps tooling.

The American Medical Association is located at 330 N. Wabash Avenue, Chicago, IL 60611 and is convenient to all public transportation in Chicago.

This role is an exempt position, and the salary range for this position is $115,523.42-$150,972.44. This is the lowest to highest salary we believe we would pay for this role at the time of this posting. An employee's pay within the salary range will be determined by a variety of factors including but not limited to business consideration and geographical location, as well as candidate qualifications, such as skills, education, and experience. Employees are also eligible to participate in an incentive plan. To learn more about the American Medical Association's benefits offerings, please click here.

We are an equal opportunity employer, committed to diversity in our workforce. All qualified applicants will receive consideration for employment. As an EOE/AA employer, the American Medical Association will not discriminate in its employment practices due to an applicant's race, color, religion, sex, age, national origin, sexual orientation, gender identity and veteran or disability status.

THE AMA IS COMMITTED TO IMPROVING THE HEALTH OF THE NATION

Remote working/work at home options are available for this role.
Associate Partner, Data and Technology Transformation
✦ New
$250 +
Chicago, IL 1 day ago
Introduction
Your role and responsibilities
About the Opportunity

IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.


You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.


This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.


Key Responsibilities
Market Leadership & Growth

  • Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.


  • Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.


  • Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.



Strategic Advisory & Transformation Delivery

  • Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.


  • Architect integrated solutions that include:


  • Migration from legacy data platforms to modern cloud-based architectures


  • Data engineering and Information governance


  • Business intelligence and advanced analytics


  • GenAI-powered and agentic AI-driven automation and decisioning


  • Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.



Engagement Excellence & Financial Stewardship

  • Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.


  • Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.


  • Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.



Practice Building & Talent Development

  • Recruit, mentor, and grow top-tier consultants, architects, and data specialists.


  • Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.


  • Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.



Thought Leadership & Market Presence

  • Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.


  • Represent IBM at industry conferences, client events, webinars, and executive roundtables.


  • Create original thought leadership—articles, perspectives, point-of-views—that positions IBM as a leading advisor in data and AI-driven transformation.



This position can be performed anywhere in the US.


"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."


Required technical and professional expertise
Qualifications

  • 12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.


  • Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.


  • Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:


  • AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)


  • Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)


  • Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)


  • Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.


  • Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.


  • Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.


  • Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.


  • Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.


  • Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.


  • Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.



Preferred technical and professional experience

IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.


Data Integration & AI Engineer
✦ New
Salary not disclosed
Edison, NJ 1 day ago

About Wakefern

Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.


Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.


The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.


Essential Functions

  • Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
  • Implement and enforce data quality and governance standards to ensure the accuracy and consistency of data.
  • Provide input for project plans and timelines to align with business objectives.
  • Monitor project progress, identify risks, and implement mitigation strategies.
  • Work with cross-functional teams and ensure effective communication and collaboration.
  • Provide regular updates to the management team.
  • Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
  • Communicate and promote the code of ethics and business conduct.
  • Ensure completion of required company compliance training programs.
  • Be trained, through formal education or experience, in software/hardware technologies and development methodologies.
  • Stay current through personal development and professional and industry organizations.

Responsibilities

  • Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
  • Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
  • Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
  • Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
  • Ensure data solutions and data sources meet quality, security, and compliance standards.
  • Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
  • Provide technical training, documentation, and ongoing support to end users of data automation systems.
  • Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.


Qualifications

  • A bachelor's degree or higher in computer science, information systems, or a related field.
  • Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
  • Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
  • Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Experience with workflow orchestration tools such as Cloud Composer or Airflow
  • Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
  • Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
  • Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
  • Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
  • Experience establishing and enforcing data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
  • Ability to collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
  • Hands-on experience with IBM DataStage and Alteryx is a plus.
  • Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
  • Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
  • Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
  • Familiarity with data modeling tools.
  • Familiarity with DevOps practices for data (CI/CD pipelines)
  • Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
  • Strong knowledge and skills in data management, data quality, and data governance.
  • Strong communication, collaboration, and problem-solving skills.
  • Ability to work on multiple projects and prioritize tasks effectively.
  • Ability to work independently and in a team environment.
  • Ability to learn new technologies and tools quickly.
  • The ability to handle stressful situations.
  • Highly developed business acuity and acumen.
  • Strong critical thinking and decision-making skills.
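The RAG-pipeline qualification above centers on one core mechanic: ranking indexed passages by vector similarity to a query embedding. The sketch below illustrates that retrieval step in plain Python with toy, hand-written vectors; in practice the embeddings would come from a model and live in a vector database such as Pinecone or Vertex AI Vector Search, and all names and data here are hypothetical.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical indexed knowledge base: (passage, embedding) pairs.
index = [
    ("Store hours are 8am-10pm.", [0.9, 0.1, 0.0]),
    ("Returns accepted within 30 days.", [0.1, 0.9, 0.2]),
    ("Loyalty points expire yearly.", [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=2):
    """Return the k passages most similar to the query embedding."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [passage for passage, _ in ranked[:k]]

print(retrieve([0.85, 0.15, 0.05]))
```

The retrieved passages are then injected into the LLM prompt as grounding context; curation and indexing quality (the "knowledge base" work named in the bullet) largely determine how useful that context is.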


Working Conditions & Physical Demands

This position requires in-person office presence at least 4x a week.


Compensation and Benefits

The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.

Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.


Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.

System Administrator - Microsoft Purview (Data Catalog & Governance)
Salary not disclosed
Raleigh, NC 2 days ago
Role: System Administrator - Microsoft Purview (Data Catalog & Governance)

Location: 100% Remote

Duration: 12+ Months

Overview:

An experienced Administrator to operate and support the enterprise implementation of Microsoft Purview Data Catalog across a complex, multi-platform data environment. The administrator will be responsible for the day-to-day configuration, monitoring, and maintenance of Purview capabilities, ensuring reliable metadata ingestion, catalog quality, lineage visibility, and compliance alignment across governed data domains.

This role focuses on platform operations and governance execution, working within established architecture and enterprise governance standards.

Key Responsibilities

Platform Administration & Operations:


  • Administer and operate Microsoft Purview Data Map and Data Catalog environments.
  • Monitor platform health, scan execution, metadata ingestion, and lineage availability.
  • Troubleshoot and resolve catalog, scan, and connectivity issues.
  • Perform routine maintenance, configuration updates, and service optimizations.
  • Coordinate incident resolution with internal engineering teams and Microsoft support as required.

Data Source Management & Scanning:


  • Register, configure, and maintain data sources across Azure, M365, on-prem, and approved third-party platforms.
  • Configure and schedule metadata scans for supported sources.
  • Manage authentication for scans using managed identities, service principals, and Key Vault secrets.
  • Monitor scan performance, failures, and coverage; take corrective action as needed.
  • Optimize scan frequency and scope to balance cost, performance, and governance coverage.

Catalog Configuration & Metadata Management:


  • Maintain and enforce enterprise metadata standards within the Purview Catalog.
  • Manage business metadata, classifications, glossary terms, and custom attributes.
  • Ensure metadata accuracy, completeness, and consistency across data assets.
  • Support curation activities including asset certification and publishing.
  • Resolve duplicate, incomplete, or stale catalog entries.

Lineage & Discovery Enablement:


  • Enable and validate data lineage ingestion from supported data platforms.
  • Monitor lineage completeness and visibility for critical data assets.
  • Assist data consumers and stewards with lineage-based impact analysis.
  • Escalate lineage gaps or tool limitations requiring architectural or engineering remediation.

Security, Access & Governance Controls:


  • Configure and manage Purview role-based access control (RBAC) within collections.
  • Provision and maintain access for administrators, data curators, and data stewards.
  • Enforce domain-based access controls and separation of duties.
  • Integrate Purview access with Microsoft Entra ID.
  • Support sensitivity labels and classification alignment with Microsoft Information Protection.

Compliance & Risk Support:


  • Support automated discovery of sensitive data (PII, PCI, PHI).
  • Assist risk, audit, and compliance teams with catalog evidence and reporting.
  • Validate scan coverage for regulated data domains.
  • Support regulatory and audit initiatives (SOX, GLBA, NYDFS, GDPR, etc.).

User Support & Enablement:


  • Provide operational support to data producers, consumers, and data stewards.
  • Respond to access requests, catalog issues, and usage questions.
  • Maintain operational documentation, runbooks, and standard operating procedures.
  • Support onboarding of new data domains following established governance patterns.
  • Assist with training and adoption initiatives led by governance or architecture teams.


Required Qualifications:


  • 5+ years of experience supporting enterprise data platforms or governance tools, and 4+ years of hands-on MS Purview experience at enterprise scale.
  • Hands-on experience administering Microsoft Purview Data Catalog.
  • Strong understanding of metadata management, data classification, and lineage concepts.
  • Working knowledge of Azure data services and enterprise data ecosystems.
  • Experience managing access controls and identities using Microsoft Entra ID.
  • Familiarity with regulated data environments and compliance requirements.
  • Strong troubleshooting, operational support, and documentation skills.


Preferred Qualifications:


  • Experience supporting Purview integrations with Synapse, Fabric, Databricks, Snowflake, or SQL Server.
  • Exposure to financial services or other regulated industries.
  • Experience with PowerShell, REST APIs, or basic automation for operational tasks.
  • Prior experience supporting enterprise data governance or stewardship programs.
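The REST-automation qualification above typically involves calling the Purview scanning data plane, for example to trigger a scan run. The sketch below only builds the request URL; the account, source, and scan names are placeholders, the `api-version` shown is an assumption to verify against the current Purview scanning API reference, and authentication (an Entra ID bearer token) plus the actual HTTP PUT are deliberately omitted.

```python
from uuid import uuid4

def scan_run_url(account, data_source, scan_name, run_id,
                 api_version="2022-02-01-preview"):  # assumed version, verify before use
    """Build the scan-run endpoint URL for a Purview account (sketch only)."""
    return (
        f"https://{account}.purview.azure.com/scan"
        f"/datasources/{data_source}/scans/{scan_name}"
        f"/runs/{run_id}?api-version={api_version}"
    )

# Placeholder names; a PUT to this URL with a bearer token starts the scan run.
run_id = uuid4()
print(scan_run_url("contoso-purview", "AdlsGen2-Finance", "WeeklyFullScan", run_id))
```

Operationally this kind of helper would sit behind a scheduled job or runbook, paired with a matching GET to poll run status and alert on scan failures.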
Data Product Engineer
🏢 Spectraforce Technologies
Salary not disclosed
Newark, NJ 2 days ago
Job Title: Marketplace Data Product Engineer

Duration: 6+ months

Location: 100% Remote

Job Overview

The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains, leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. This role will guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.

Key Responsibilities


  • Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
  • Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
  • Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
  • Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
  • Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
  • Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
  • Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
  • Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
  • Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
  • Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.


Required Skills & Qualifications


  • 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
  • Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations, and metadata-driven architectures.
  • Ability to configure and demonstrate platform features, build light proofs of concept, and support technical onboarding.
  • Excellent communication and presentation skills, with experience translating technical concepts for business partners.
  • Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
  • Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
  • Strong collaboration and stakeholder management skills across engineering, product, and business teams.
  • Comfortable working in fast-moving environments and driving clarity through ambiguity.


Preferred Qualifications


  • Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
  • Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
  • Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
  • Background in solution architecture, customer engineering, or sales engineering.
  • Experience developing demo environments, sample apps, or repeatable platform enablement assets.
  • Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.

