CEIC Data Jobs in USA
11,328 positions found
Minimum of five years experience working in analytics with hospitals and health plans.
Advanced proficiency required with VBA, SQL, Salesforce, Excel and Access.
High-level skills with web applications across all major browsers; ability to teach others how to use web-based database functions.
Demonstrated experience using Microsoft Office computer applications, including Word, Access, Outlook and SharePoint.
Advanced knowledge of Excel required.
Detail-oriented with strong follow-through and ability to work independently given standard guidelines and checklists.
Good writing and communication skills.
Able to draft grammatically correct and professional email messages.
Demonstrated experience in working successfully with minimal supervision.
Must have knowledge of medical and health care terminology.
Ability to complete HIPAA training and implement high-level protections on patient information and confidentiality.
Must work effectively independently and in a team setting.
Ability to relate well with internal and external customers.
Quality/Metrics: Gather and perform analysis on data from Salesforce, Loopback, Excel, and other databases as required.
Perform data cleaning as needed to ensure data are consistent and analyzable.
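Data cleaning of this kind usually means trimming whitespace and mapping inconsistent free-text values to canonical labels before analysis. A minimal sketch in Python; the field names and value mappings are hypothetical, not taken from any specific program dataset:

```python
# Minimal data-cleaning sketch: strip stray whitespace and normalize
# inconsistent categorical values so records are consistent and analyzable.
# Field names and the status mapping below are illustrative assumptions.

CANONICAL_STATUS = {
    "enrolled": "Enrolled",
    "active": "Enrolled",
    "closed": "Closed",
    "discharged": "Closed",
}

def clean_record(record: dict) -> dict:
    """Trim whitespace on string fields and map status text to canonical labels."""
    cleaned = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    status = cleaned.get("status", "")
    # Unknown values pass through unchanged so they can be flagged later.
    cleaned["status"] = CANONICAL_STATUS.get(status.lower(), status)
    return cleaned

rows = [
    {"member_id": "A001", "status": "  ENROLLED"},
    {"member_id": "A002", "status": "discharged"},
]
cleaned_rows = [clean_record(r) for r in rows]
```

Unrecognized values are deliberately left as-is rather than guessed at, so a later validation step can report them.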
Create data reports, charts, graphs and tables for regular reporting to program leads and external partners.
Export data from software systems and program tracking logs for agency reporting.
Assemble reports, papers and presentation materials as directed.
Collect data through phone and in-person interviews.
Record or transcribe data in accordance with project and funding source guidelines.
Perform literature reviews (locating, listing &/or abstracting articles).
Enter literature references into a shared database (such as EndNote).
Responsibilities: Data cleaning, formatting, and maintenance as needed.
Data visualization and analysis of program metrics.
Data Entry for the program(s) assigned.
Program reporting/billing/invoicing support.
Administrative duties as needed (mailing and other assigned work).
Establish and maintain systems for program accountability; reports track performance.
Attend and ensure follow-up after all meetings and presentations: minutes, reports, action plans, assignments, etc.
Monitors the performance and responsibilities of field staff with respect to database management, metrics, and documentation.
Reports all errors in systems and workflows to both internal and external stakeholders.
Completes reporting (both internal and contractual requirements) with thorough knowledge and understanding of what is being reported.
Develops and maintains a current understanding of the Department’s Contractual Agreements.
Must have professional verbal and written communication skills, as well as computer/software proficiency.
Assists with both internal and external customer service calls, emails, and requests.
Other miscellaneous tasks as assigned.
SQL Server database design, implementation, and troubleshooting: develop, optimize, and maintain complex T-SQL queries, stored procedures, indexes, and constraints; resolve performance issues, deadlocks, and contention using traces, execution plans, and profiling.
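Reading an execution plan to confirm index usage is central to this kind of troubleshooting. As a minimal sketch, the Python snippet below uses SQLite purely as a stand-in for SQL Server (where you would read the plan in SSMS or via SHOWPLAN); the table and index names are made up:

```python
import sqlite3

# Sketch: verify that a filtered query uses an index rather than a full
# table scan -- the same check an execution-plan review performs in
# SQL Server. SQLite stands in here; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, member_id TEXT, amount REAL)"
)
conn.execute("CREATE INDEX idx_claims_member ON claims (member_id)")
conn.executemany(
    "INSERT INTO claims (member_id, amount) VALUES (?, ?)",
    [("M1", 100.0), ("M2", 250.0), ("M1", 75.0)],
)

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM claims WHERE member_id = ?", ("M1",)
).fetchall()
detail = " ".join(row[-1] for row in plan)  # last column is the plan text
uses_index = "USING INDEX" in detail  # a bare SCAN would signal a missing index
```

The same habit applies at scale: a plan showing a scan where a seek was expected usually points at a missing or unusable index.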
Design, develop, test, and implement ETL/ELT processes using Talend for data extraction, transformation, and loading from diverse sources, including Salesforce CRM data.
Administer and optimize Talend environment, including job scheduling, dependencies, monitoring, automation, patches, upgrades, and performance tuning.
Integrate Salesforce data (e.g., via APIs, connectors) into SQL Server databases and data warehouses, ensuring data quality, synchronization, and real-time/batch processing.
Collaborate face-to-face with business stakeholders to analyze requirements, gather specifications, evaluate data sources/targets, and design solutions that improve business performance.
Lead ETL development activities, ensure code quality, provide feedback on performance.
Support enterprise data warehouse, data marts, and business intelligence initiatives; perform source data analysis and dimensional modeling.
Develop and automate processes using scripting.
Provide tier 2/3 support, evaluate production issues, recommend improvements, and participate in project planning following Agile methodologies.
Perform proactive performance optimization and data synchronization across environments.
Mentor staff, recommend process enhancements, and contribute specialized knowledge across IT and business operations.
Document data integration processes, workflows, ETL designs, data mappings, technical specifications, and system configurations.
Manage version control and deployments.
Collaborate on testing (unit, integration, UAT).
Translated business requirements into actionable data specifications, documentation, and code solutions using Salesforce Object Manager and official documentation.
Reviewed Salesforce release notes, verified production deployments, and conducted feature testing across sandbox and production environments with detailed feedback submission.
Developed and maintained complex SOQL queries to support data team operations, reporting, and analytics needs.
Designed and built custom Salesforce reports to support data operations and Enhanced Care Management (ECM) programs.
Developed and deployed end-to-end solutions for processing health plan MIF data, enabling efficient insert, update, and reporting workflows for Lead and Case objects.
Performed large-scale data inserts, updates, and migrations using Salesforce Data Loader in both sandbox and production environments.
Extracted, analyzed, and transformed backend Salesforce data using Talend and SQL to produce accurate reports for compliance, billing, and operational needs.
Identified and resolved reporting discrepancies and data quality issues through root-cause analysis and targeted corrections.
Cleaned, standardized, and transformed referral data for mass uploads into Salesforce while enforcing validation rules and workflow requirements.
Created Salesforce-based error reports that enabled program teams to quickly identify and correct data entry issues.
Conducted data gap analyses against vendor reporting requirements and designed field transformations and new data structures to meet compliance and reporting standards.
Integrated offshore datasets with Salesforce records to address missing or incomplete data, improving accuracy for reporting and billing.
Reduced manual data entry and correction efforts by automating large-scale updates, inserts, and fixes via Salesforce Data Loader.
Maintained vendor zip code records in Salesforce to ensure accurate service area tracking, correct billing rates, and reliable historical reference.
Partners in Care Foundation is an equal opportunity employer.
We are committed to complying with all federal, state, and local laws providing equal employment opportunities, and all other employment laws and regulations.
It is our intent to maintain a work environment which is free of harassment, discrimination, or retaliation because of age, race (including hair texture and protective hairstyles, such as braids, locks, and twists), color, national origin, ancestry, religion, sex, sexual orientation, pregnancy (including childbirth, lactation/breastfeeding, and related medical conditions), physical or mental disability, genetic information (including testing and characteristics, as well as those of family members), veteran status, uniformed service member status, gender, gender identity, gender expression, transgender status, arrest or conviction record, domestic violence victim status, credit history, unemployment status, caregiver status, sexual and reproductive health decisions, salary history or any other status protected by federal, state, or local laws.
All qualified applicants will receive consideration for employment and reasonable accommodations may be made to enable qualified individuals to perform the essential functions of the position.
Remote working/work at home options are available for this role.
Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.
Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture.
Drive long-term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
Conduct root-cause analysis and implement preventive and long-term remediation solutions.
Optimize SQL queries, tune stored procedures, and improve data processing performance.
Document audit findings, validation processes, data flows, standards, and quality reports.
Build dashboards and reports for data quality KPIs using Power BI/Tableau.
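The validation rules described above would normally live in T-SQL stored procedures or SSIS packages. As a language-neutral illustration of the same pattern, each rule below is a SQL predicate that selects violating rows, run here against SQLite from Python; table, column, and rule names are assumptions for the sketch:

```python
import sqlite3

# Sketch of automated data quality rules: each rule is a SQL query that
# returns the keys of violating rows, mirroring how T-SQL validation
# procedures are typically structured. All names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (member_id TEXT, dob TEXT, zip TEXT)")
conn.executemany("INSERT INTO members VALUES (?, ?, ?)", [
    ("M1", "1980-04-02", "90210"),
    ("M2", None,         "90001"),  # violates the not-null rule
    ("M1", "1975-11-20", "9021"),   # duplicate id, malformed zip
])

RULES = {
    "dob_not_null":
        "SELECT member_id FROM members WHERE dob IS NULL",
    "zip_is_5_digits":
        "SELECT member_id FROM members "
        "WHERE zip NOT GLOB '[0-9][0-9][0-9][0-9][0-9]'",
    "member_id_unique":
        "SELECT member_id FROM members GROUP BY member_id HAVING COUNT(*) > 1",
}

violations = {name: [r[0] for r in conn.execute(sql)] for name, sql in RULES.items()}
```

Keeping each rule as a standalone query makes it easy to schedule the whole set, log violation counts over time, and feed the results into quality dashboards.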
Data Stewardship & Governance
Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
Ensure proper and consistent data usage across departments and systems.
Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
Provide training on data entry, data handling, stewardship practices, and data literacy.
Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield / Low-Maturity Environment
Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
Drive adoption of data quality and governance practices across business and technical teams.
Support long-term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
Advanced T-SQL, SQL Server development, debugging, and performance tuning.
SSIS development, deployment, and troubleshooting.
Data profiling, validation rule design, quality scoring, and measurement techniques.
ETL/ELT pipeline design, debugging, and optimization.
Data modeling (conceptual, logical, physical).
Metadata management and lineage documentation.
Reporting and dashboarding with Power BI, Tableau, or similar tools.
Strong documentation and communication skills.
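As a rough illustration of the profiling and quality-scoring skills listed above, the sketch below computes per-column completeness and an aggregate score. The dataset, column names, and the equal-weight scoring scheme are assumptions for illustration, not a standard:

```python
# Profiling sketch: per-column completeness plus a simple aggregate quality
# score. The rows, columns, and equal-weight averaging are illustrative.

rows = [
    {"id": "1", "name": "Acme", "region": "West"},
    {"id": "2", "name": "",     "region": "East"},
    {"id": "3", "name": "Beta", "region": None},
]

def completeness(rows, column):
    """Fraction of rows where the column is present and non-empty."""
    filled = sum(1 for r in rows if r.get(column) not in (None, ""))
    return filled / len(rows)

columns = ["id", "name", "region"]
profile = {c: completeness(rows, c) for c in columns}
quality_score = sum(profile.values()) / len(columns)  # equal-weight average
```

In practice the same per-column measures (completeness, validity, uniqueness, timeliness) would be computed in SQL against production tables and trended in a Power BI or Tableau dashboard.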
Preferred Skills
Knowledge of DAMA-DMBOK, DCAM, MDM concepts, and governance frameworks.
Experience in low-maturity/greenfield data environments.
Familiarity with AI/ML data readiness and feature-store-aligned data structuring.
Cloud data engineering exposure (Azure, Databricks, GCP).
Education
Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.
Master’s degree preferred.
Certifications (Preferred)
DAMA CDMP (Associate/Practitioner)
EDM Council DCAM
ASQ Data Quality Credential
Collibra Data Steward Certification
Certified Data Steward (eLearningCurve)
Cloud/AI certifications (Azure, Databricks, Google)
Location: Anywhere in Country
At EY, we’re all in to shape your future with confidence.
We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
AI & Data - Data Architecture – Senior Manager – Power & Utilities Sector
EY is seeking a motivated professional with solid experience in the utilities sector to serve as a Senior Manager with a robust background in Data Architecture, Data Modernization, end-to-end data capabilities, AI, Gen AI, and Agentic AI, preferably with a power systems / electrical engineering background and a track record of delivering business use cases in Transmission / Distribution / Generation / Customer. The ideal candidate will have a history of working for consulting companies and be well-versed in the fast-paced culture of consulting work. This role is dedicated to the utilities sector, where the successful candidate will craft, deploy, and maintain large-scale, AI-ready data architectures.
The opportunity
You will help our clients enable better business outcomes while working in the rapidly growing Power & Utilities sector. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high growth area and a high visibility role with plenty of opportunities to enhance your skillset and build your career.
As a Senior Manager in Data Architecture, you will have the opportunity to lead transformative technology projects and programs that align with our organizational strategy to achieve impactful outcomes. You will provide assurance to leadership by managing timelines, costs, and quality, and lead both technical and non-technical project teams in the development and implementation of cutting-edge technology solutions and infrastructure. You will have the opportunity to be face to face with external clients and build new and existing relationships in the sector. Your specialized knowledge in project and program delivery methods, including Agile and Waterfall, will be instrumental in coaching others and proposing solutions to technical constraints.
Your key responsibilities
In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions. Your responsibilities will include:
As Data Architect – Senior Manager, you will have an expert understanding of data architecture and data engineering and will be focused on problem-solving to design, architect, and present findings and solutions, leading more junior team members, and working with a wide variety of clients to sell and lead delivery of technology consulting services. You will be the go-to resource for understanding our clients’ problems and responding with appropriate methodologies and solutions anchored around data architectures, platforms, and technologies. You are responsible for helping to win new business for EY. You are a trusted advisor with a broad understanding of digital transformation initiatives, the analytic technology landscape, industry trends and client motivations. You are also a charismatic communicator and thought leader, capable of going toe-to-toe with the C-level in our clients and prospects and willing and able to constructively challenge them.
Skills and attributes for success
To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:
- Technical Skills: Applications Integration
- Cloud Computing and Cloud Computing Architecture
- Data Architecture Design and Modelling
- Data Integration and Data Quality
- AI/Agentic AI driven data operations
- Experience delivering business use cases in Transmission / Distribution / Generation / Customer.
- Strong relationship management and business development skills.
- Become a trusted advisor to your clients’ senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems.
- Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements.
- Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients’ objectives.
- Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, key business processes, brainstorm solutions, and align on data architecture strategies and projects.
- Lead the design and implementation of modern data architectures, supporting transactional, operational, analytical, and AI solutions.
- Direct and mentor global data architecture and engineering teams, fostering a culture of innovation, collaboration, and continuous improvement.
- Establish data governance policies and practices, including data security, quality, and lifecycle management.
- Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities.
To qualify for the role, you must have
- A Bachelor’s degree in a STEM field.
- 12+ years of professional experience in industry or in technology consulting.
- 12+ years of hands-on experience architecting, designing, delivering, or optimizing data lake solutions.
- 5+ years’ experience with native cloud products and services such as Azure or GCP.
- 8+ years of experience mentoring and leading teams of data architects and data engineers, fostering a culture of innovation and professional development.
- In-depth knowledge of data architecture principles and best practices, including data modelling, data warehousing, data lakes, and data integration.
- Demonstrated experience in leading large data engineering teams to design and build platforms with complex architectures and diverse features including various data flow patterns, relational and no-SQL databases, production-grade performance, and delivery to downstream use cases and applications.
- Hands-on experience in designing end-to-end architectures and pipelines that collect, process, and deliver data to its destination efficiently and reliably.
- Proficiency in data modelling techniques and the ability to choose appropriate architectural design patterns, including Data Fabric, Data Mesh, Lakehouse, or Delta Lake.
- Experience managing complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimizations.
- Hands-on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala.
- Ability to design data solutions that can scale horizontally and vertically while optimizing performance.
- Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads.
- Experience in version control systems (e.g. Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps).
- Practical understanding of data encryption, access control, and security best practices to protect sensitive data.
- Experience leading Infrastructure and Security engineers and architects in overall platform build.
- Excellent leadership, communication, and project management skills.
- Data Security and Database Management
- Enterprise Data Management and Metadata Management
- Ontology Design and Systems Design
Ideally, you’ll also have
- Master’s degree in Electrical / Power Systems Engineering, Computer science, Statistics, Applied Mathematics, Data Science, Machine Learning or commensurate professional experience.
- Experience working at big 4 or a major utility.
- Experience with cloud data platforms like Databricks.
- Experience in leading and influencing teams, with a focus on mentorship and professional development.
- A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
- The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
- Building and Managing Relationships
- Client Trust and Value and Commercial Astuteness
- Communicating With Impact and Digital Fluency
What we look for
We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role.
FY26NATAID
What we offer you
At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
- We offer a comprehensive compensation and benefits package where you’ll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
- Join us in our team‑led and leader‑enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
- Under our flexible vacation policy, you’ll decide how much vacation time you need based on your own personal circumstances. You’ll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well‑being.
Are you ready to shape your future with confidence? Apply today.
EY accepts applications for this position on an on‑going basis.
For those living in California, please click here for additional information.
EY focuses on high‑ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY’s Talent Shared Services Team (TSS) or email the TSS at .
Title: Sr. Manager, Data Governance
Location: Richardson, TX (Hybrid)
Duration: 6 months, with possibility of FTE conversion
JOB SUMMARY
This position incubates and establishes a leading-edge global Data Governance function to support business segments, corporate functions, and Digital & Technology stakeholders. Responsibilities include:
- Liaise directly with clients and account teams to provide strategic direction on the implementation of data governance programs, best practices, adoption of standards, master data management, and data quality improvement while leveraging leading-edge data governance tools and technology.
- Collaborate with and manage high-performing data governance and data management professionals who support occupier clients and account teams.
- Provide support on data strategy execution in the adoption of data products, including an enterprise data platform that provides game-changing analytics in the CRE industry.
- Serve as the data governance champion of strategic data products and supporting metadata and reference data.
- Implement and support data ownership and stewardship programs for stakeholders across the business to ensure that account teams adopt improved data governance and management practices.
ESSENTIAL DUTIES AND RESPONSIBILITIES
- Participate in the strategy, planning, and execution for Enterprise Data Governance, focusing on the Building Operations & Experience (BOE) business segment. Ensure the company has the urgency, sensitivity, and thought leadership needed for competitive capabilities around data.
- Demonstrated leadership experience in a large, complex, global organization, including the ability to work and communicate effectively across organizational lines. Ensure business stakeholder understanding, alignment, and commitment to the objectives of the data governance and management program(s). Be the champion and evangelist for data, its business value, and its potential innovations. Be a trusted advisor to senior leadership and peers.
- Demonstrated experience in building relationships and leading high-performing teams with top talent around the world. Build a high-performance environment and execute a people strategy that attracts, retains, develops, and motivates the team by fostering an inclusive work environment, communicating vision, values, and business strategy, and managing succession and development planning.
- Collaborate with partners across business segments/business lines, regions, and accounts to develop consistent data governance capabilities at all levels, influencing decisions relating to policy, practices, supporting technology, and talent development.
- Establish leading data management practices and shared services relating to data quality, data provisioning, metadata, lineage, reference data, issue management and change management.
- Implement data governance as commodity services that could be leveraged by various clients in different industries. Understand clients' appetite and risk culture in day-to-day support activities and decision-making.
- Establish account team data governance programs. Define data domains and implement business oversight via essential data governance organizations and RACI (i.e., a central data governance function, a Data Ownership and Stewardship Program, etc.). Establish data standards, policies, and controls. Design and implement the framework, including associated processes, necessary to sustain a data control environment. Monitor compliance with data policies and standards.
- Establish account team and cross-account data quality framework necessary to enable data quality reporting, issue identification, remediation and tracking, ultimately ensuring trust and confidence in data across domains.
- Guide client accounts in adopting the strategic data products, including existing account migrations and new account transitions. Manage data to support the company's and its clients' business intelligence, scaling appropriately with business growth.
- Experience in leading and driving leading-edge data innovation initiatives including big data, cloud computing, IoT, data virtualization and federation, etc., is a plus.
- Create and implement strategic approaches, plans, timelines, preparation of business cases to ensure expedited handling of client data protection, and other data compliance and security requirements.
- Develop and implement metrics needed to monitor and report on data governance and data management progress.
- Develop communication approaches and change management strategies; determine presentation focus and emphasis and prepare board-level presentations.
- Performs other duties as assigned.
SUPERVISORY RESPONSIBILITIES
Manages the planning, organization, and controls for a major functional area or department. The position will be responsible for managing direct reports across the Americas region and working with peers across all regions, requiring flexibility in schedule. May also be responsible for matrix reports. This position makes recommendations for staff recruitment, selection, promotion, advancement, corrective action, and termination of subordinates, and effectively recommends the same for direct reports to next-level management for review and approval. Monitors appropriate staffing levels and reports on utilization and deployment of human resources. Leads and supports staff in areas of staffing, selection, training, development, coaching, mentoring, measuring, appraising, and rewarding performance and retention. Leads by example and models behaviors that are consistent with the company's values.
QUALIFICATIONS
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
EDUCATION and EXPERIENCE
Bachelor's degree (BA/BS) from four-year college or university and a minimum of eight years of related experience and/or training, including five years of experience at the management level.
- 5 or more years of progressively responsible management positions in complex organizations required. Demonstrated success with high visibility projects, leaders in technology use and development, change management, budget and business case development and staff development.
- 5 or more years of related experience in related industry; commercial real estate management preferred.
- 7 or more years of data management-related experience, such as data analysis, data governance, enterprise information management, data modeling, and data quality management. Analytics experience desired, e.g., data visualization, data analytics, data mining, business intelligence, etc.
- Candidates must have experience working in large organizations with geographically dispersed teams and complex technical environments.
- Experience in dealing with internal and external customers, service providers and vendors. Must be able to manage competing priorities. Needs to be resilient; resolving conflicts quickly to achieve desired business results.
- Bachelor's degree in Business Administration, Information Management, MIS, Business Intelligence and Data Science, Library Science, Computer Science, or related fields; advanced degree preferred.
CERTIFICATES and/or LICENSES
None
COMMUNICATION SKILLS
- Ability to comprehend, analyze, and interpret the most complex business documents. Ability to respond effectively to the most sensitive issues. Ability to write reports, manuals, speeches and articles using distinctive style. Ability to make effective and persuasive presentations on complex topics to employees, clients, top management and/or public groups. Ability to motivate and negotiate effectively with key employees, top management, and client groups to take desired action.
- Ability to establish and maintain a high level of customer trust and confidence in the overall information and analytics space.
- Excellent oral, written, and presentation communication skills. Strong negotiation and group facilitation skills; ability to move a process forward, while meeting the needs of a variety of clients.
- Excellent collaboration, influence and leadership skills. Ability to work with various levels of peers including analysts, developers and executives regarding complex business and data related issues.
- Relationship management skills that include excellent listening and consultative capability, the ability to influence and negotiate with business and technology partners to drive change, and the ability to take a broad perspective and make key connections.
FINANCIAL KNOWLEDGE
- Requires basic knowledge of financial terms and principles.
- Participates in complex financial/business analysis and reviews of reports prepared by peers or leaders.
- Manages to and oversees the department budget.
REASONING ABILITY
- Ability to solve advanced problems and deal with a variety of options in complex situations. Requires expert level analytical and quantitative skills with proven experience in developing strategic solutions for a growing matrix-based environment. Draws upon the analysis of others and makes recommendations that have a direct impact on the company.
- Understanding of global organizational design and the ability to shape and drive large-scale, cross-functional programs around people, technology, processes, and tools.
- Demonstrated ability to balance long-term strategy with quick wins.
- Demonstrated ability to strategically influence and educate cross-functional stakeholders on the strategic importance and value of data governance.
- Excellent managerial skills; collaborative, imaginative, resourceful, reliable, technically savvy.
- Superior analytical and creative problem-solving skills. Demonstrated successes in data analysis, drawing conclusions and improvement. Apply listening and consultative skills to understand business needs; be able to interpret requirements, identify impacts and analyze problems to determine impacts to business processes across the organization.
- Ability to work well under deadlines, ability to work in a multi-tasking production environment to make good judgments about competing priorities.
- Ability to tell a story to explain or sell a concept.
OTHER SKILLS and/or ABILITIES
- Utilizes an entrepreneurial approach and develops innovative solutions.
- Ability to write business cases, process maps, presentation materials and articles using distinctive style.
- Ability to make effective and persuasive presentations on complex topics across various levels of leadership.
- Expert level analytical and quantitative skills with proven experience in developing strategic solutions for a growing matrix-based multi-industry sales environment.
- Ability to use strong conceptual and analytical skills to generate insights and recommendations.
- Demonstrated information management and quantitative skills, including working knowledge of IT infrastructure, various technologies/platforms, and the alignment of vendor solutions with enterprise strategic priorities.
- Experience managing small to mid-size teams and delivering results.
- Thorough knowledge of cutting-edge data management tools, industry advances, etc.
- Superior project management/consulting and leadership skills. Demonstrated ability to facilitate complex, mission-critical projects and to develop, participate in, and guide multi-disciplinary work teams. Manage task timelines and deliverable schedules, and raise concerns about Data Governance deliverables, timelines, and issues.
- Superior ability to manage, manipulate and analyze raw data, draw conclusions, and develop actionable recommendations using technology. Articulate the issues and resolutions via business-friendly communications. Serve as primary day-to-day contact for regional data management issues.
- Advanced understanding of data quality management. Knowledge of data governance and how it impacts business processes.
- Knowledge of master data management in a global environment, including data lifecycle and maintenance processes.
- Skills in MS Visio, Word, and PowerPoint are a plus.
- Experience with reference data management tools, including Collibra, MS Excel, SQL queries, etc., is a plus.
- Software development lifecycle knowledge, with a background in agile philosophies and practices.
Job Title: Distribution and Marketing Data Product Manager
Division: Beazley Shared Services - Data Management
Location: Multiple Locations, US
Hybrid Role
Reports To: Head of Data Products
Key Relationships: Chief Data Office, Data Leadership Team, Data Owners, Distribution and Marketing, CRM, Data Governance and Quality, Data Stewards, Data Architects, Delivery Team members, Technology Team, Finance, Underwriting, Operations and other Business Stakeholders
Beazley:
Beazley is a global specialist insurance company with over 30 years' experience helping people, communities, and businesses to manage risk all around the world. Our products are wide-ranging, from cyber & tech to marine, healthcare, financial institutions, and contingency, covering risks like the weather, film production, or protection from deadly weapons.
We are a flexible and innovative employer offering a friendly, collaborative, diverse and inclusive work environment. We encourage applications from all backgrounds. Collaboration in office spaces is important and we use a hybrid approach with a minimum of 2 days in the office per week.
We have a wonderful mix of cultures, experiences, and backgrounds at Beazley, with over 1,500 of us working around the world. Our employees' diversity, experience, and passion allow us to keep innovating and moving forward, delivering our best. We hire people with wide perspectives, and we have set bold diversity targets as we work towards excellence.
Data @ Beazley:
Our Data team supports Beazley's vision by...
* Being bold through pioneering & championing an exciting vision of how people interact with data
* Facilitating innovation by leading the pace of change in data & analytics, and facilitating the latest capabilities and innovative technologies
* Doing the right thing by providing a controlled working data environment that allows all business domains to thrive independently
* Being the single source of truth for enterprise-wide reporting metrics and KPIs
Our Data team is located across multiple offices in the UK, Europe, and the US. The specified home-office location options provide the best balance for being co-located with key Data Office colleagues and business stakeholders.
The Role:
Data is one of Beazley's greatest assets, and this role is critical to supporting our Distribution and Marketing insights, which include Customer, Broker, and Marketing data. We're seeking a strategic and technically savvy Data Product Manager to lead the strategy, development, and evolution of data products and insights that empower our distribution and marketing teams, aligning our data, unlocking insights, and informing growth opportunities across our specialty portfolio. In this role, you will also work to mature data literacy and capabilities as Beazley undertakes a significant investment in modernization, enabling you to embed a culture of data excellence and innovation in our delivery.
Key Responsibilities:
Partner with the global Distribution and Marketing team to understand, prioritize and develop data products and insights that support their business strategy.
Build and own a roadmap to provide regular updates on delivery commitments for data products, insights, enhancements and queries.
Manage stakeholder relationships to support the growth strategy for Beazley customers, brokers, teams and products.
Produce insights and key data trends that highlight business performance, ROI, efficiencies, and game-changing growth opportunities.
Inspire the adoption and use of insights to drive decisions in investment and operations that improve efficiency and drive growth by leading demonstrations and hands on training sessions.
Lead a team of Product Owners, Product Analysts, Business Analysts, and a development team to deliver and maintain data products and insights, maintaining a backlog of work within Jira.
Represent the business in data governance discussions, escalating issues as appropriate.
Ensure that data product development considers policy, methodology and standards, and ensure these are adhered to during product development.
Evaluate the performance of your data product portfolio against KPIs defined by the business and provide feedback on the value delivered.
Proactively anticipate business needs and look for opportunities to bring innovation or new approaches into the user design, experience, product development and insights.
Relentlessly focus on the Distribution and Marketing team as a customer, delivering high quality data and insights that are clear and inspire action.
Partner with the Data Governance Group and CRM solution team (Customer Relationship Management) to drive improvements in our Customer and Broker data quality through MDM and other tools.
Provide leadership, direction, development and support to direct reports (including off-shore resources).
Essential Criteria:
Bachelor's degree in Business, Marketing, Data Science, Computer Science, Economics, Statistics or related field; Master's degree preferred
Proven experience in data product management, marketing analytics or distribution strategy, preferably in insurance or financial services
Experience working with data, building data models, and sharing insights
Skills and Abilities:
Strategic and curious with the ability to design and develop data and insights that support our Distribution and Marketing team's goals, planning, performance and incentives that drive growth
Understand the specialty insurance market, customer segmentation and distribution channels, with experience in North America, Lloyd's, Retail and Wholesale markets preferred
Ability to lead workshops that help your stakeholders identify data needs and articulate their desired user experience, with the ability to build dashboards preferred
Strong organization and communication skills with the ability to direct work, document requirements and present demos
Advanced technical skills with the ability to dive into the data, identify anomalies, and provide high quality, trusted data
Understanding of Specialty Insurance principles and key drivers to create opportunities, loyalty and growth
Knowledge and Experience:
Experience in Data Products, Data Analytics, Data Science, Statistics, Economics or related fields in Insurance, Financial or sales organizations preferred
Strong understanding of MDM and CRM systems and their use with Customer and Broker data
Proficiency in data visualization (Power BI), analytics platforms (Snowflake), dashboard design, and data storytelling
Experience working with insurance data, and in particular a strong understanding of pipeline intelligence for sales growth/targeting and performance
Ability to use predictive modeling to understand performance, customer behavior, and prospective renewals/growth, helping the Distribution Sales team focus on the best opportunities
Experience managing relationships and teams of stakeholders, business analysts, data analysts, data architects, data modelers, data engineers and testers using agile processes
Skills in data engineering technologies such as Kafka, Snowflake/Snowpark, and Databricks, along with Jira and Agile principles
Experience in managing and manipulating large internal and external datasets
Knowledge of relational and dimensional database structures, theories, principles, and practices
Driven and proven team player with ability to work with all levels in a highly intellectual, collaborative, and fast paced environment
Excellent communication skills, with the ability to tailor them appropriately for different audiences, technical backgrounds, and seniority
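The knowledge of relational and dimensional database structures listed above can be illustrated with a minimal star-schema sketch: one fact table joined to two dimensions, then aggregated by dimension attributes. All table names, column names, and data values here are hypothetical, and sqlite3 stands in for a warehouse platform such as Snowflake; this is an illustration of the pattern, not Beazley's actual model.

```python
import sqlite3

# Hypothetical star schema: a premium fact table plus broker and product
# dimensions. Names and figures are invented for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_broker  (broker_id INTEGER PRIMARY KEY, broker_name TEXT, region TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, product_line TEXT);
CREATE TABLE fact_premium (
    broker_id  INTEGER REFERENCES dim_broker(broker_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    gross_premium REAL
);
INSERT INTO dim_broker  VALUES (1, 'Acme Re', 'US'), (2, 'Globex', 'UK');
INSERT INTO dim_product VALUES (10, 'Cyber'), (20, 'Marine');
INSERT INTO fact_premium VALUES (1, 10, 5000.0), (1, 20, 2500.0), (2, 10, 4000.0);
""")

# Typical dimensional query: aggregate the fact table by dimension attributes.
rows = cur.execute("""
    SELECT b.region, p.product_line, SUM(f.gross_premium) AS total_premium
    FROM fact_premium f
    JOIN dim_broker  b ON b.broker_id  = f.broker_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY b.region, p.product_line
    ORDER BY b.region, p.product_line
""").fetchall()
for region, line, total in rows:
    print(region, line, total)
```

The same fact/dimension split underlies the Power BI semantic models and Snowflake schemas mentioned elsewhere in this posting.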
Who We Are:
Beazley is a specialist insurance company with over 30 years' experience helping people, communities and businesses to manage risk all around the world. Our mission is to inspire our clients and people with the confidence and freedom to explore, create and build - to enable businesses to thrive. Our clients want to live and work freely and fully, knowing they are benefitting from the most advanced thinking in the insurance market. Our goal is to become the highest performing sustainable specialist insurer.
Our products are wide ranging, from cyber & tech insurance to marine, healthcare, financial institutions and contingency; covering risks such as the weather, film production or protection from deadly weapons.
Our Culture
We have a wonderful mix of cultures, experiences, and backgrounds at Beazley, with over 2,000 of us working around the world. Our employees' diversity, experience, and passion allow us to keep innovating and moving forward, delivering our best. We are proud of our family-feel culture at Beazley, which empowers our staff to work when and where they want, in an adult environment that is big on collaboration, diversity of thought, and personal accountability. Our three core values inspire the way we work and how we treat our people and customers.
- Be bold
- Strive for better
- Do the right thing
Upholding these values every day has enabled us to become an innovative and responsive organization in touch with the changing world around us - our ambitious inclusion & diversity and sustainability targets are testament to this.
We are a flexible and innovative employer offering a friendly, collaborative, and inclusive working environment. We actively encourage and expect applications from all backgrounds. Our commitment to fostering a supportive and dynamic workplace ensures that every employee can thrive and contribute to our collective success.
Explore a variety of networks to assist with professional and/or personal development. Our Employee Networks include:
- Beazley RACE - Including, understanding and celebrating People of Colour
- Beazley SHE - Successful, High potential, Empowered women in insurance
- Beazley Proud - Our global LGBTQ+ community
- Beazley Wellbeing - Supporting employees with their mental wellbeing
- Beazley Families - Supporting families and parents-to-be
We encourage internal career progression at Beazley, giving you all the tools you need to drive your own career here, such as:
- Internal Pathways (helping you grow into an underwriting role)
- iLearn (our own learning & development platform)
- LinkedIn Learning
- Mentorship program
- External qualification sponsorship
- Continuing education and tuition reimbursement
- Secondment assignments
The Rewards
- The opportunity to connect and build long-lasting professional relationships while advancing your career with a growing, dynamic organization
- Attractive base compensation and discretionary performance related bonus
- Competitively priced medical, dental and vision insurance
- Company-paid life insurance and short- and long-term disability insurance
- 401(k) plan with 5% company match and immediate vesting
- 22 days PTO (prorated for 1st calendar year of employment), 11 paid holidays per year, with the ability to flex the religious bank holidays to suit your religious beliefs
- Up to $700 reimbursement for home office setup
- Free in-office lunch, reimbursement for travel to the office, and a monthly lifestyle allowance
- Up to 26 weeks of fully paid parental leave
- Up to 2.5 days paid annually for volunteering at a charity of your choice
- Flexible working policy, trusting our employees to do what works best for them and their teams
Salary for this role will be tailored to the successful individual's location and experience. The expected compensation range for this position is $130,000-$150,000 per year plus discretionary annual bonus.
Don't meet all the requirements? At Beazley we're committed to building a diverse, inclusive, and authentic workplace. If you're excited about this role but your experience doesn't perfectly align with every requirement and qualification in the job specification, we encourage you to apply anyway. You might just be the right candidate for this, or one of our other roles.
We are an equal opportunities employer and as such, we will make reasonable adjustments to our selection process for candidates that indicate that, owing to disability, our arrangements might otherwise disadvantage them. If you have a disability, including dyslexia or other non-visible ones, which you believe may affect your performance in selection, please advise us in good time and we'll make reasonable adjustments to our processes for you.
Location: Atlanta, Georgia
Full/Part Time: Full-Time
Regular/Temporary: Regular
About Us
Overview
Georgia Tech prides itself on its technological resources, collaborations, high-quality student body, and its commitment to building an outstanding and diverse community of learning, discovery, and creation. We strongly encourage applicants whose values align with our institutional values, as outlined in our Strategic Plan. These values include academic excellence, diversity of thought and experience, inquiry and innovation, collaboration and community, and ethical behavior and stewardship. Georgia Tech has policies to promote a healthy work-life balance and is aware that attracting faculty may require meeting the needs of two careers.
About Georgia Tech
Georgia Tech is a top-ranked public research university situated in the heart of Atlanta, a diverse and vibrant city with numerous economic and cultural strengths. The Institute serves more than 45,000 students through top-ranked undergraduate, graduate, and executive programs in engineering, computing, science, business, design, and liberal arts. Georgia Tech's faculty attracted more than $1.4 billion in research awards this past year in fields ranging from biomedical technology to artificial intelligence, energy, sustainability, semiconductors, neuroscience, and national security. Georgia Tech ranks among the nation's top 20 universities for research and development spending and No. 1 among institutions without a medical school.
Georgia Tech's Mission and Values
Georgia Tech's mission is to develop leaders who advance technology and improve the human condition. The Institute has nine key values that are foundational to everything we do:
1. Students are our top priority.
2. We strive for excellence.
3. We thrive on diversity.
4. We celebrate collaboration.
5. We champion innovation.
6. We safeguard freedom of inquiry and expression.
7. We nurture the wellbeing of our community.
8. We act ethically.
9. We are responsible stewards.
Over the next decade, Georgia Tech will become an example of inclusive innovation, a leading technological research university of unmatched scale, relentlessly committed to serving the public good; breaking new ground in addressing the biggest local, national, and global challenges and opportunities of our time; making technology broadly accessible; and developing exceptional, principled leaders from all backgrounds ready to produce novel ideas and create solutions with real human impact.
Department Information
The Office of Institutional Research and Planning (IRP) at Georgia Tech is a research and analytics service unit dedicated to supporting the campus community. Our team of institutional research and data analytics professionals combines technical and creative skills to inform institutional strategic decision-making, planning, and research across campus. In addition to institutional reporting and compliance, IRP provides data education, support, and resources to all campus units.
Visit our website to learn more about what we do:
Job Summary
Data Analysts analyze data, interpret trends and patterns, and provide insights to support decision-making processes. They develop data models, perform data mining and statistical analysis, and collaborate with stakeholders to optimize data-driven strategies.
Responsibilities
Job Duty 1 -
Collect, analyze, and interpret data from various sources, databases, and systems to extract insights, trends, and patterns that inform business decisions, strategies, and operations.
Job Duty 2 -
Develop and maintain data models, queries, and reports using SQL, Python, R, or data analysis tools to perform data cleansing, transformation, and visualization tasks.
Job Duty 3 -
Identify data quality issues, anomalies, and discrepancies in datasets, conduct data validation, data profiling, and data integrity checks to ensure data accuracy and reliability.
Job Duty 4 -
Create data visualizations, dashboards, and data analytics reports to communicate data findings, trends, and key metrics to stakeholders, management, and decision-makers.
Job Duty 5 -
Conduct ad-hoc data analysis, exploratory data analysis, and statistical analysis to support decision-making processes, performance monitoring, and data-driven insights.
Job Duty 6 -
Perform data mining, predictive analytics, and machine learning tasks to uncover hidden patterns, predict outcomes, and drive data-driven decision-making in organizations.
Job Duty 7 -
Utilize data analytics tools, business intelligence platforms, and statistical software packages to conduct data analysis, data modeling, and data visualization tasks efficiently and accurately.
Job Duty 8 -
Stay current on data analytics trends, tools, and methodologies through training, certifications, and industry publications to enhance data analysis skills and knowledge.
Job Duty 9 -
Collaborate with business users, data scientists, and Information Technology teams to define data requirements, analytics requirements, and data-driven solutions for business problems and opportunities.
Job Duty 10 -
Perform other job-related duties as assigned.
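Several of the duties above, particularly the data validation, profiling, and integrity checks in Job Duty 3, can be sketched in a few lines of plain Python. The record layout, field names, and valid ranges below are hypothetical examples invented for illustration, not an actual institutional dataset or tool.

```python
from collections import Counter

def profile_records(records, key_field, required_fields, ranges):
    """Return basic data-quality findings (missing values, duplicate
    keys, out-of-range values) for a list of dict records."""
    findings = {"missing": Counter(), "duplicate_keys": [], "out_of_range": []}
    seen = set()
    for rec in records:
        # Completeness check: count empty/absent required fields.
        for field in required_fields:
            if rec.get(field) in (None, ""):
                findings["missing"][field] += 1
        # Uniqueness check: flag repeated key values.
        key = rec.get(key_field)
        if key in seen:
            findings["duplicate_keys"].append(key)
        seen.add(key)
        # Validity check: flag values outside their allowed range.
        for field, (lo, hi) in ranges.items():
            val = rec.get(field)
            if val is not None and not (lo <= val <= hi):
                findings["out_of_range"].append((key, field, val))
    return findings

# Hypothetical records with deliberate quality issues.
records = [
    {"id": 1, "gpa": 3.2, "term": "Fall"},
    {"id": 2, "gpa": 4.7, "term": "Fall"},   # gpa out of range
    {"id": 2, "gpa": 3.9, "term": ""},       # duplicate id, missing term
]
report = profile_records(records, "id", ["id", "term"], {"gpa": (0.0, 4.0)})
print(report)
```

In practice these checks would run against database extracts rather than in-memory lists, but the three categories of finding carry over directly.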
Responsibilities
The Institutional Research Data Analyst will also be expected to perform various duties specific to institutional research, including but not limited to:
- Responding to intermediate to high difficulty/complexity ad-hoc data and analysis requests
- Adhering to federal, state, and institutional policies, regulations, and requirements related to data security, privacy, and governance
- Completing or supporting the completion of externally driven compliance and data-related reporting, including:
- Federal, e.g., IPEDS, NSF-HERD, NSF-GSS, etc.
- State, e.g., USG data collections, data requests, etc.
- Higher education organizations, e.g., AAUDE, SREB, NSC, accrediting bodies, etc.
Required Qualifications
Educational Requirements
Bachelor's Degree in related discipline or equivalent combination of education and experience. Advanced certification may be preferred or required (some profiles may require additional education).
Required Experience
Four or more years of relevant experience.
Proposed Salary
Annual Salary Range: $75,751 to $80,000
Knowledge, Skills, & Abilities
SKILLS
- Performs all the standard and technical aspects of the job
- Applies in-depth professional, technical, or industry knowledge to manage significantly complex assignments/projects/programs
- Advanced knowledge of the principles and practices of a particular field of specialization and of Institute policies, practices, and procedures
USG Core Values
The University System of Georgia comprises 25 institutions of higher education and learning as well as the System Office. The USG Statement of Core Values is Integrity, Excellence, Accountability, and Respect. These values serve as the foundation for all that we do as an organization, and each USG community member is responsible for demonstrating and upholding these standards. More details on the USG Statement of Core Values and Code of Conduct are available in USG Board Policy 8.2.18.1.2 and can be found online at policymanual/section8/C224/#p8.2.18_personnel_conduct.
Additionally, USG supports Freedom of Expression as stated in Board Policy 6.5 Freedom of Expression and Academic Freedom found on-line at policymanual/section6/C2653.
Equal Employment Opportunity
The Georgia Institute of Technology (Georgia Tech) is an Equal Employment Opportunity Employer. The Institute is committed to maintaining a fair and respectful environment for all. To that end, and in accordance with federal and state law, Board of Regents policy, and Institute policy, Georgia Tech provides equal opportunity to all faculty, staff, students, and all other members of the Georgia Tech community, including applicants for admission and/or employment, contractors, volunteers, and participants in institutional programs, activities, or services. Georgia Tech complies with all applicable laws and regulations governing equal opportunity in the workplace and in educational activities.
Equal opportunity and decisions based on merit are fundamental values of the University System of Georgia ("USG") and Georgia Tech. Georgia Tech prohibits discrimination, including discriminatory harassment, on the basis of an individual's race, ethnicity, ancestry, color, religion, sex (including pregnancy), national origin, age, disability, genetics, or veteran status in its programs, activities, employment, and admissions. Further, Georgia Tech prohibits citizenship status, immigration status, and national origin discrimination in hiring, firing, and recruitment, except where such restrictions are required in order to comply with law, regulation, executive order, or Attorney General directive, or where they are required by Federal, State, or local government contract.
Other Information
This is not a supervisory position.
This position does not have any financial responsibilities.
This position will not be required to drive.
This role is not considered a position of trust.
This position does not require a purchasing card (P-Card).
This position will not travel.
This position does not require security clearance.
Background Check
The successful candidate must be able to pass a background check. Please visit employment/pre-employment-screening.
The University of Maryland (UMD) seeks a Manager of Data Analytics Enablement to lead the adoption and modernization of enterprise analytics capabilities that enable trusted, data-informed decision-making across campus.
This is an exciting time to join UMD as we advance enterprise data and analytics through a period of innovative growth and modernization.
This role will play a key part in shaping the future of enterprise business intelligence, advancing Microsoft Power BI and Fabric capabilities, and embedding sustainable data quality and stewardship practices into analytics workflows.
Reporting to the Director of Enterprise Data Services, this position partners with institutional leaders, IT teams, and enterprise stakeholders to deliver reliable data products, consistent metrics, and actionable insights.
The manager will lead a team of data professionals and advance practical, operational governance practices that support trusted analytics and long-term institutional impact.
Key Responsibilities: Lead the strategy, development, and continuous improvement of the university’s enterprise business intelligence environment, including Microsoft Power BI and Microsoft Fabric.
Establish standards, best practices, and architectural patterns for semantic models, dashboards, and analytics delivery.
Guide migration and modernization efforts to ensure scalable, secure, and high-performing analytics solutions.
Develop and manage an analytics intake, prioritization, and delivery framework aligned with institutional priorities.
Define and implement data quality monitoring practices to ensure reliability, accuracy, and consistency of enterprise data assets.
Partner with technical teams to embed validation, monitoring, and observability into data pipelines and lakehouse environments.
Promote consistent metric definitions and collaborate with campus stakeholders to clarify data ownership and stewardship roles.
Support adoption of metadata management, data catalog, and lineage capabilities.
Ensure analytics solutions align with university standards for security, privacy, and responsible data use.
Manage, mentor, and develop a team of analytics and data professionals, fostering a culture of quality, collaboration, and service.
Communicate analytics priorities, progress, and impact to leadership and campus partners.
This position is considered essential and may be required to work at the normal work location or an alternative location during a major catastrophic event, weather emergency, or other operational emergency to help maintain the continuity of University services.
May be required to work evenings, nights, weekends, or different shifts for extended periods.
KNOWLEDGE, SKILLS, & ABILITIES:
Knowledge of data privacy and security principles and practices necessary to protect systems and data from threats.
Knowledge in areas of subject matter expertise such as databases, data modeling, ETL, reporting, data governance practices, metadata management, data stewardship, and/or regulatory compliance.
Skill in SQL or programming/scripting languages (e.g., Python) used for integrations, data pipelines, report development, and data management.
Skill in adapting communication style to different audiences, including technical, business, and executive stakeholders.
Skill in the use of office productivity software such as Office 365 or Google Workspace.
Ability to lead presentations and training for large groups.
Ability to manage communications and relationships with technical and business stakeholders.
Ability to collaborate effectively with other Managers, Assistant Directors, and Directors to identify and solve problems, make improvements, and address ongoing issues.
Ability to provide a team with effective direction and support in implementations using standards and techniques that lead to a repeatable and reliable solution.
Ability to ensure documentation standards and procedures are implemented for all team responsibilities.
Ability to define deadlines and manage the quality of the work delivered.
Ability to comprehend and handle interpersonal dynamics, demonstrate empathy towards team members, and effectively manage conflicts or challenging circumstances.
Ability to coach and mentor team members in order to enhance their performance, provide constructive feedback, and support skill development.
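Applied to the data-quality monitoring responsibilities described in this posting, the SQL skills above might look like the following minimal sketch: a null-rate check and a referential-integrity check of the kind a pipeline could run after each load. Table and column names are hypothetical, and sqlite3 stands in for an enterprise platform such as Fabric or Databricks.

```python
import sqlite3

# Hypothetical tables seeded with deliberate quality issues.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE departments (dept_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE employees (emp_id INTEGER PRIMARY KEY, email TEXT, dept_id INTEGER);
INSERT INTO departments VALUES (1, 'DIT'), (2, 'Libraries');
INSERT INTO employees VALUES
    (100, 'a@umd.edu', 1),
    (101, NULL,        2),   -- missing email
    (102, 'c@umd.edu', 9);   -- dept_id 9 has no matching department (orphan)
""")

# Null-rate check: fraction of employees missing an email address.
null_rate = conn.execute(
    "SELECT AVG(CASE WHEN email IS NULL THEN 1.0 ELSE 0.0 END) FROM employees"
).fetchone()[0]

# Referential-integrity check: rows pointing at a nonexistent department.
orphans = conn.execute("""
    SELECT e.emp_id FROM employees e
    LEFT JOIN departments d ON d.dept_id = e.dept_id
    WHERE d.dept_id IS NULL
""").fetchall()

print(round(null_rate, 3), [r[0] for r in orphans])
```

Checks like these can be scheduled after each load and their results tracked over time, which is one practical form the "validation, monitoring, and observability" responsibility can take.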
Physical Demands: Sedentary work.
Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects.
Repetitive motion.
Substantial movements (motions) of the wrists, hands, and/or fingers.
The worker is required to have close visual acuity to perform an activity such as: preparing and analyzing data and figures; transcribing; viewing a computer terminal; extensive reading.
Minimum Qualifications Education: Bachelor’s degree from an accredited college or university.
Experience: Three (3) years of professional experience supporting the operations, maintenance, and administration of data systems, analytics platforms, or data management programs.
One (1) year leading or supervising professional staff.
Other: Additional work experience as defined above may be substituted on a year-for-year basis for up to four (4) years of the required education.
Preferences: Demonstrated experience leading business intelligence or enterprise analytics initiatives.
Experience managing or mentoring data professionals in a collaborative team environment.
Strong experience with Power BI and modern data platforms such as Microsoft Fabric, Databricks, or similar cloud-based analytics ecosystems.
Proficiency with SQL and/or Python in support of analytics, data modeling, or data quality initiatives.
Experience implementing or advancing data quality practices, including validation, monitoring, or metric standardization.
Experience supporting practical data governance activities such as establishing shared definitions, coordinating data stewardship, or implementing metadata/catalog tools.
Demonstrated ability to collaborate across diverse stakeholders and translate business needs into scalable analytics solutions.
Strong communication skills with the ability to engage both technical and non-technical audiences.
Experience using Jira or similar tools for work intake, project tracking, and prioritization.
Additional Information: Please note that all positions within the Division of Information Technology (DIT) have an in-person component, with expected time each week at our College Park, MD location.
Telework is not a guaranteed work arrangement.
Visa Sponsorship Information: DIT will not sponsor the successful candidate for work authorization in the United States now or in the future.
F1 STEM OPT support is not available for this position.
Required Application Materials: Resume, Cover Letter, List of three References
Best Consideration Date: March 26, 2026
Open Until Filled: Yes
Salary Range: $149,120.00 - $178,944.00
Please apply at: Job Risks: Not Applicable to This Position Financial Disclosure Required: No For more information on Financial Disclosure, please visit Maryland's State Ethics Commission website.
Department: DIT-EE-Enterprise Data Services Worker Sub-Type: Staff Regular Benefits Summary: For more information on Regular Exempt benefits, select this link .
Background Checks: Offers of employment are contingent on completion of a background check.
Information reported by the background check will not automatically disqualify anyone from employment.
Before any adverse decision, the finalist will have an opportunity to provide information to the University regarding disclosable background check information.
The University reserves the right to rescind the offer of employment or otherwise decline or terminate employment if the information reported by the background check is deemed incompatible with the position, regardless of when the background check is completed.
Employment Eligibility: The successful candidate must complete employment eligibility verification (on Form I-9) by presenting documents that establish identity and work authorization within the timeframe required by federal immigration law, and where applicable, to demonstrate renewed employment authorization.
Failure to complete employment eligibility verification or reverification within the timeframe set forth by law may result in suspension or termination of employment.
EEO Statement : The University of Maryland, College Park is an Equal Opportunity Employer.
All qualified applicants will receive equal consideration for employment.
Please read the University’s Equal Employment Opportunity Statement of Policy.
Title IX Non-Discrimination Notice
Translate business process designs into clear master and transactional data definitions for S/4HANA.
Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.
Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).
Define data objects, attributes, and mandatory fields.
Support business rules, validations, and derivations.
Align data structures to SAP best practices and industry standards.
Support data cleansing, enrichment, and harmonization activities.
Define and validate data mapping rules from legacy systems to S/4HANA.
Participate in mock conversions, data loads, and reconciliation activities.
Ensure data quality thresholds are met prior to cutover.
Support the establishment and enforcement of global data standards and policies.
Work closely with Master Data and Data Governance teams.
Help define roles, ownership, and stewardship models for value stream data.
Contribute to data quality monitoring and remediation processes.
Support functional and integrated testing with a strong focus on data accuracy.
Validate business scenarios using migrated and created data.
Support cutover planning and execution from a data perspective.
Provide post-go-live support and stabilization.
Requirements: 5 years of SAP functional experience with a strong data focus.
Hands-on experience with SAP S/4HANA (greenfield preferred).
Proven involvement in large-scale, global ERP implementations.
Deep understanding of value stream business processes and related data objects.
Experience supporting data migration, cleansing, and validation.
Required Skills: Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).
Understanding of S/4HANA data model changes vs. ECC.
Experience working with SAP MDG or similar governance tools preferred.
Familiarity with data migration tools (e.g., SAP Migration Cockpit, LVM, ETL tools).
Ability to read and interpret functional specs and data models.
Strong stakeholder management and communication skills.
Ability to work across global, cross-functional teams.
Detail-oriented with strong analytical and problem-solving skills.
Comfortable operating in a fast-paced transformation environment.
Preferred Skills: Experience in manufacturing, building materials, or asset-intensive industries.
Prior role as Functional Data Lead or Data Domain Lead.
Experience defining global templates and harmonized data models.
Knowledge of data quality tools and metrics.
Experience with MDG and setting up cost center and profit center groups.
Your role and responsibilities
About the Opportunity
IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.
You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.
This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.
Key Responsibilities
Market Leadership & Growth
Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.
Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.
Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.
Strategic Advisory & Transformation Delivery
Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.
Architect integrated solutions that include:
Migration from legacy data platforms to modern cloud-based architectures
Data engineering and Information governance
Business intelligence and advanced analytics
GenAI-powered and agentic AI-driven automation and decisioning
Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.
Engagement Excellence & Financial Stewardship
Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.
Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.
Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.
Practice Building & Talent Development
Recruit, mentor, and grow top-tier consultants, architects, and data specialists.
Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.
Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.
Thought Leadership & Market Presence
Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.
Represent IBM at industry conferences, client events, webinars, and executive roundtables.
Create original thought leadership—articles, perspectives, point-of-views—that positions IBM as a leading advisor in data and AI-driven transformation.
This position can be performed anywhere in the US.
"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."
Required technical and professional expertise
Qualifications
12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.
Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.
Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:
AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)
Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)
Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)
Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.
Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.
Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.
Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.
Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.
Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.
Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.
Preferred technical and professional experience
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Senior Data Modeler
Hybrid 3-4 days onsite
Location: Phoenix, Arizona
Salary: $130,000 - $150,000 base
A large, operationally complex organization is undergoing a major modernization of its data platform and is building a new, cloud-native analytics foundation from the ground up. This is a greenfield opportunity for a senior-level data modeler to establish best practices, influence architecture, and help shape how data is organized and used across the business.
This role sits at the center of a multi-year transformation focused on modern analytics, scalable data products, and strong collaboration between data and business teams.
What You’ll Be Working On
- Designing and implementing enterprise data models across conceptual, logical, and physical layers
- Establishing Medallion architecture patterns and reusable modeling assets
- Building dimensional and semantic models that support analytics and reporting
- Partnering closely with domain experts and functional leaders to translate business needs into data structures
- Collaborating with data engineers to align models with ELT pipelines and analytics frameworks
- Helping define modeling standards and upskilling senior engineers in modern data modeling practices
- Contributing hands-on to data engineering work where needed (SQL, transformations, optimization)
- Proactively identifying analytics opportunities and recommending data structures to support them
This role is roughly 40% data modeling, 30% hands-on engineering, and 30% cross-functional collaboration.
Must-Have Experience
- Strong, hands-on experience with data modeling (dimensional, canonical, semantic)
- Deep understanding of Medallion architecture
- Advanced SQL and experience working with a modern cloud data warehouse
- Experience with dbt for transformations and modeling
- Hands-on experience in cloud-native data environments (AWS preferred)
- Ability to work directly with business stakeholders and explain technical concepts clearly
- Experience collaborating closely with data engineers on execution
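The Medallion architecture named in the requirements above can be sketched in miniature: raw data lands untouched in a bronze layer, gets validated and typed in silver, and is aggregated for analytics in gold. The record fields below are illustrative assumptions, not a real schema; a warehouse implementation would use tables and SQL/dbt rather than Python dicts.

```python
# Minimal Medallion-architecture sketch: bronze (raw) -> silver (cleaned)
# -> gold (aggregated). Layer names follow the standard pattern; the
# order records and fields are illustrative assumptions.

# Bronze: raw ingested records, kept as-is (including bad rows).
bronze_orders = [
    {"order_id": "1", "amount": "100.50", "region": "west"},
    {"order_id": "2", "amount": "not_a_number", "region": "west"},  # bad row
    {"order_id": "3", "amount": "75.25", "region": "east"},
]

def to_silver(rows):
    """Silver: validate and standardize types; drop rows that fail parsing."""
    silver = []
    for row in rows:
        try:
            silver.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "region": row["region"].strip().lower(),
            })
        except (KeyError, ValueError):
            continue  # a real pipeline would route this to a quarantine table
    return silver

def to_gold(rows):
    """Gold: business-level aggregate, here revenue per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

silver_orders = to_silver(bronze_orders)
gold_revenue = to_gold(silver_orders)
print(gold_revenue)  # {'west': 100.5, 'east': 75.25}
```

The key design point is that each layer only ever reads from the one before it, so bad source data can always be reprocessed from bronze without re-ingesting.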
Nice to Have
- Python experience
- Familiarity with Informatica or reverse-engineering legacy data models
- Exposure to streaming or near-real-time data pipelines
- Experience with visualization tools (tool choice is flexible)
Who Will Thrive in This Role
- A senior individual contributor who enjoys building from scratch
- Someone who can act as a modeling expert and mentor in an organization formalizing this practice
- Comfortable working in ambiguity and taking initiative
- Strong communicator who enjoys partnering with both technical and non-technical teams
- Equally comfortable discussing business concepts and physical data models
Why This Role Is Unique
- Greenfield data modeling initiative with real influence
- Opportunity to define standards that will be used across the organization
- Work on large-scale, real-world operational and analytical data
- High visibility within a growing data organization
- Flexible work setup for individual contributors
If you’re excited about shaping a modern data foundation and want to be the person who defines how data is modeled, understood, and used, this is a rare opportunity to make a lasting impact.
Job Title – Lead Data Engineer
Please note this role cannot offer visa transfer or sponsorship, now or in the future.
About the role
As a Lead Data Engineer, you will make an impact by designing, building, and operating scalable, cloud‑native data platforms supporting batch and streaming use cases, with strong focus on governance, performance, and reliability. You will be a valued member of the Data Engineering team and work collaboratively with cross‑functional engineering, cloud, and architecture stakeholders.
In this role, you will:
- Design, build, and operate scalable cloud‑native data platforms supporting batch and streaming workloads with strong governance, performance, and reliability.
- Develop and operate data systems on AWS, Azure, and GCP, designing cloud‑native, scalable, and cost‑efficient data solutions.
- Build modern data architectures including data lakes, data lakehouses, and data hubs, with strong understanding of ingestion patterns, data governance, data modeling, observability, and platform best practices.
- Develop data ingestion and collection pipelines using Kafka and AWS Glue; work with modern storage formats such as Apache Iceberg and Parquet.
- Design and develop real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks, with understanding of event‑driven architectures and low‑latency data processing.
- Perform data transformation and modeling using SQL‑based frameworks and orchestration tools such as dbt, AWS Glue, and Airflow, including Slowly Changing Dimensions (SCD) and schema evolution.
- Use Apache Spark extensively for large‑scale data transformations across batch and streaming workloads.
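The Slowly Changing Dimension (SCD) Type 2 pattern mentioned in the responsibilities above can be sketched in plain SQL: when a tracked attribute changes, the current dimension row is closed out with an end date and a new current row is inserted, preserving full history. This runs against in-memory SQLite as a stand-in; the `dim_customer` table and its columns are illustrative assumptions, not a specific platform's schema.

```python
# Sketch of an SCD Type 2 update: close out the current version of a
# dimension row, then insert the new version. Table and column names
# are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,      -- NULL means "current version"
    is_current  INTEGER
);
INSERT INTO dim_customer VALUES (42, 'Austin', '2024-01-01', NULL, 1);
""")

def scd2_update(conn, customer_id, new_city, change_date):
    """Version a changed attribute instead of overwriting it."""
    cur = conn.execute(
        "SELECT city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row is None or row[0] == new_city:
        return  # nothing changed; SCD2 only versions real changes
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (change_date, customer_id))
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, change_date))

scd2_update(conn, 42, "Denver", "2024-06-01")
history = conn.execute(
    "SELECT city, valid_from, valid_to, is_current "
    "FROM dim_customer WHERE customer_id = 42 ORDER BY valid_from").fetchall()
print(history)
# [('Austin', '2024-01-01', '2024-06-01', 0), ('Denver', '2024-06-01', None, 1)]
```

On Delta Lake or a similar lakehouse platform the same logic is typically expressed as a single `MERGE` statement rather than separate UPDATE/INSERT steps.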
Work model
We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role’s business requirements, this is a hybrid position requiring 4 days a week in a client or Cognizant office in Atlanta, GA. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.
The working arrangements for this role are accurate as of the date of posting. This may change based on the project you’re engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.
What you need to have to be considered
- Hands‑on experience developing and operating data systems on AWS, Azure, and GCP.
- Proven ability to design cloud‑native, scalable, and cost‑efficient data solutions.
- Experience building data lakes, data lakehouses, and data hubs with strong understanding of ingestion patterns, governance, modeling, observability, and platform best practices.
- Expertise in data ingestion and collection using Kafka and AWS Glue, with experience in Apache Iceberg and Parquet.
- Strong experience designing and developing real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks.
- Deep expertise in data transformation and modeling using SQL‑based frameworks and orchestration tools including dbt, AWS Glue, and Airflow, with knowledge of SCD and schema evolution.
- Extensive experience using Apache Spark for large‑scale batch and streaming data transformations.
These will help you stand out
- Experience with event‑driven architectures and low‑latency data processing.
- Strong understanding of schema evolution, SCD modeling, and modern data modeling concepts.
- Experience with Apache Iceberg, Parquet, and modern ingestion/storage patterns.
- Strong knowledge of observability, governance, and platform best practices.
- Ability to partner effectively with cloud, architecture, and engineering teams.
Salary and Other Compensation:
Applications will be accepted until March 17, 2025.
The annual salary for this position is between $81,000 and $135,000, depending on experience and other qualifications of the successful candidate.
This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.
Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
- Medical/Dental/Vision/Life Insurance
- Paid holidays plus Paid Time Off
- 401(k) plan and contributions
- Long‑term/Short‑term Disability
- Paid Parental Leave
- Employee Stock Purchase Plan
Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Applications Development: Experience in GenAI application development is a strong plus.
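The hands-on development responsibility above follows the classic extract-transform-load shape. A minimal sketch using only the standard library (a production version would use PySpark and Delta Lake; the CSV fields here are illustrative assumptions):

```python
# Minimal ETL sketch: extract raw CSV, transform (drop incomplete rows,
# cast types), and load into a table. SQLite stands in for a warehouse;
# the sensor-reading schema is an illustrative assumption.
import csv
import io
import sqlite3

raw_csv = io.StringIO(
    "sensor_id,reading,ts\n"
    "s1,21.5,2024-05-01T00:00\n"
    "s1,,2024-05-01T01:00\n"        # missing reading -> dropped in transform
    "s2,19.0,2024-05-01T00:00\n"
)

# Extract: parse the raw feed into dicts.
rows = list(csv.DictReader(raw_csv))

# Transform: filter incomplete rows and cast the reading to a float.
clean = [(r["sensor_id"], float(r["reading"]), r["ts"])
         for r in rows if r["reading"]]

# Load: write the cleaned rows to the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor_id TEXT, reading REAL, ts TEXT)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", clean)

count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)  # 2
```

In an ELT variant, the raw rows would be loaded first and the filtering/casting expressed as SQL (e.g., dbt models) inside the warehouse.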
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges, ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Microsoft Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
Job Summary:
Our client is seeking a Data Steward to join their team! This position is hybrid, located in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high quality of the data delivered to end users
- Ensure high quality of the in-house data via data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL and continuously engage with end consumers with enhancements
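The data profiling work described in the duties above typically starts with per-column health statistics: null counts, distinct counts, and sample values. A minimal sketch in plain Python (the sample records and field names are illustrative assumptions; tools like SQL or pandas would do this at scale):

```python
# Minimal column-profiling sketch: count nulls and distinct values per
# column across a set of records. The material-master-style records are
# illustrative assumptions, not a real SAP extract.
from collections import defaultdict

records = [
    {"material_id": "M-100", "plant": "STL1", "weight_kg": 12.5},
    {"material_id": "M-101", "plant": None,   "weight_kg": 7.0},
    {"material_id": "M-102", "plant": "STL1", "weight_kg": None},
]

def profile(rows):
    """Return {column: {'nulls': n, 'distinct': m}} for all columns."""
    stats = defaultdict(lambda: {"nulls": 0, "values": set()})
    for row in rows:
        for col, val in row.items():
            if val is None:
                stats[col]["nulls"] += 1
            else:
                stats[col]["values"].add(val)
    return {
        col: {"nulls": s["nulls"], "distinct": len(s["values"])}
        for col, s in stats.items()
    }

report = profile(records)
print(report["plant"])  # {'nulls': 1, 'distinct': 1}
```

A profile like this is usually the input to the data quality rules mentioned above, e.g. flagging any column whose null rate exceeds an agreed threshold.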
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles, for instance: big data cloud compute, NoSQL, etc.
- Applied experience querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- BigQuery experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.
Security Advisory: Beware of Frauds
Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.
We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.
Responsibilities
- Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
- Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
- Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
- API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
- Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
- Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
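The payload-analysis work described above can be sketched as a small Python check. The required keys and type rules below are hypothetical stand-ins for whatever the real PIM-to-storefront contract specifies:

```python
import json

# Hypothetical required keys for a product payload; the real contract
# would come from the PIM/storefront integration spec.
REQUIRED_KEYS = {"sku", "price", "weight", "image_url"}

def audit_payload(raw: str) -> list:
    """Return a list of problems found in a product JSON payload."""
    problems = []
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"malformed JSON: {exc}"]
    missing = REQUIRED_KEYS - payload.keys()
    problems += [f"missing key: {k}" for k in sorted(missing)]
    price = payload.get("price")
    if price is not None and not isinstance(price, (int, float)):
        problems.append(f"price has wrong type: {type(price).__name__}")
    return problems

print(audit_payload('{"sku": "RUG-100", "price": "199.99"}'))
# ['missing key: image_url', 'missing key: weight', 'price has wrong type: str']
```

A check like this distinguishes a data payload error (fix the record) from a code bug (hand off to Engineering), which is exactly the triage the role describes.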
What You Will NOT Do (The Boundaries)
- No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
- No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
- No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.
Intersection with Technical Teams
- With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
- With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.
Experience, Skills, & Ability Requirements
- 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
- SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
- API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
- Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
- You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
- You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
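As a rough illustration of the SQL skills listed above (CTEs, window functions, and joins used as forensic tools), the sketch below runs toy queries through Python's built-in sqlite3; real audits would run against the production warehouse with its actual schemas:

```python
import sqlite3

# Toy product/storefront tables; real schemas would live in the PIM database.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE products (sku TEXT, price REAL);
    CREATE TABLE storefront (sku TEXT);
    INSERT INTO products VALUES ('A', 10.0), ('A', 12.0), ('B', 8.0);
    INSERT INTO storefront VALUES ('A'), ('C');
""")

# Orphans: storefront SKUs with no matching product record.
orphans = con.execute("""
    SELECT s.sku FROM storefront s
    LEFT JOIN products p ON p.sku = s.sku
    WHERE p.sku IS NULL
""").fetchall()

# Duplicates: a CTE plus ROW_NUMBER() flags repeated SKUs in products.
dupes = con.execute("""
    WITH ranked AS (
        SELECT sku, price,
               ROW_NUMBER() OVER (PARTITION BY sku ORDER BY price) AS rn
        FROM products
    )
    SELECT sku, price FROM ranked WHERE rn > 1
""").fetchall()

print(orphans)  # [('C',)]
print(dupes)    # [('A', 12.0)]
```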
Bonus Points (Nice-to-Haves)
- Familiarity with Visio/Lucidchart to visualize data flows.
- Ability to build simple dashboards in Tableau to track data health scores.
- Basic familiarity with Python or R for data manipulation.
What We Offer
- Health, dental, and vision benefits
- Paid parental leave
- 401(k) with employer match
- A culture of meritocracy that fosters ongoing growth opportunities
- A stable, growing family-owned company that looks after its employees
Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the tech structure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
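The pipeline responsibilities above can be miniaturized into a toy extract-transform-load flow. The source rows and table name below are illustrative, not actual Wakefern systems:

```python
import sqlite3

# A minimal ETL sketch: extract raw rows, cast and filter them, load to a table.
def extract():
    # Stand-in for pulling rows from a source system or API.
    return [{"store": "001", "sales": "1200.50"},
            {"store": "002", "sales": "980.25"},
            {"store": "002", "sales": None}]  # bad record

def transform(rows):
    # Cast types and drop records that fail a basic quality check.
    clean = []
    for row in rows:
        if row["sales"] is None:
            continue
        clean.append((row["store"], float(row["sales"])))
    return clean

def load(rows, con):
    con.execute("CREATE TABLE IF NOT EXISTS daily_sales (store TEXT, sales REAL)")
    con.executemany("INSERT INTO daily_sales VALUES (?, ?)", rows)

con = sqlite3.connect(":memory:")
load(transform(extract()), con)
print(con.execute("SELECT COUNT(*), SUM(sales) FROM daily_sales").fetchone())
# (2, 2180.75)
```

Production versions of this pattern swap the in-memory pieces for orchestrated tasks (e.g., Airflow or Cloud Composer DAG steps), but the extract/transform/load separation is the same.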
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
The pay range for this role is $150,000 - $200,000/yr USD.
WHO WE ARE:
Headquartered in Southern California, Skechers—the Comfort Technology Company®—has spent over 30 years helping men, women, and kids everywhere look and feel good. Comfort innovation is at the core of everything we do, driving the development of stylish, high-quality products at a great value. From our diverse footwear collections to our expanding range of apparel and accessories, Skechers is a complete lifestyle brand.
ABOUT THE ROLE:
The Skechers Digital Team is seeking a Digital Data Architect reporting to the Director, Digital Architecture, Consumer Domain. This role is responsible for designing and governing Skechers’ Consumer Data 360 ecosystem, enabling identity resolution, high-quality data foundations, personalization, loyalty intelligence, and machine learning capabilities across digital and retail channels.
The ideal candidate will be a strong technical leader with hands-on, full-stack technical knowledge of enterprise technologies in Skechers’ consumer domain, and the ability to work in a fast-paced agile environment. You should have knowledge of consumer programs from an architecture/industry perspective, and strong hands-on experience designing solutions on the Salesforce Core Platform (including configuration, integration, and data model best practices).
You will work cross-functionally with Digital Engineering, Data Engineering, Data Science, Loyalty, and Marketing teams to architect scalable, secure, and high-performance data platforms that support advanced personalization and recommender systems.
WHAT YOU’LL DO:
- Responsible for the full technical life cycle of consumer platform capabilities which includes:
- Capability roadmap and technical architecture in alignment to consumer experience
- Technical planning, design, and execution
- Operations, analytics/reporting, and adoption
- Define and evolve Skechers’ Consumer Data 360 architecture, including identity resolution (deterministic and probabilistic matching) and unified customer profiles.
- Architect scalable data models and pipelines across CDP, CRM, e-commerce, marketing automation, data lake, and warehouse platforms.
- Establish enterprise data quality frameworks including validation, deduplication, anomaly detection, and observability.
- Optimize SQL workloads and large-scale distributed queries through performance tuning, partitioning, indexing, and workload management strategies.
- Design and oversee ML pipelines supporting personalization, churn modeling, and recommender systems.
- Partner with Data Science teams to productionize models using distributed platforms such as Databricks (Spark, Delta Lake, MLflow preferred).
- Ensure secure data governance, access control (RBAC/ABAC), and compliance with GDPR, CCPA, and related privacy regulations.
- Provide architectural oversight ensuring performance, scalability, resilience, and maintainability.
- Collaborate with stakeholders to translate business objectives (LTV growth, personalization lift, engagement) into scalable data solutions.
REQUIREMENTS:
- Computer Science, Data Engineering, or related degree or equivalent experience.
- 12+ years of experience architecting enterprise data platforms in cloud environments.
- 9+ years of experience in data engineering with a focus on consumer data.
- 6+ years of experience working with Salesforce platforms, including data models and enterprise integrations.
- Strong experience with Data 360 and identity resolution architectures.
- Proven expertise in SQL performance tuning and large-scale data modeling.
- Hands-on experience implementing ML pipelines and recommender systems in production environments.
- Experience with cloud technologies (AWS, GCP, or Azure).
- Experience with integration patterns (API, ETL, event streaming).
- Experience providing technical leadership and guidance across multiple projects and development teams.
- Experience translating business requirements into detailed technical specifications and working with development teams through implementation, including issue resolution and stakeholder communication.
- Strong project management skills including scope assessment, estimation, and clear technical communication with both business users and technical teams.
- Must hold at least one of the following Salesforce certifications: Platform App Builder, Platform Developer I, or JavaScript Developer I.
- Experience with Databricks or similar distributed data/ML platforms preferred.
Title: Data QA Engineer
Location: Minneapolis, Dallas, Atlanta (Onsite)
Job Type: Contract
Experience: 8-15 Years
Key Responsibilities:
- Design, build, and maintain automated data quality frameworks to validate accuracy, completeness, consistency, and timeliness of data.
- Develop automation scripts using Python/SQL to test data pipelines, ETL/ELT processes, and analytics workflows.
- Implement data quality checks and monitoring within Azure-based data platforms.
- Work extensively with Azure services (ADF, ADLS, Synapse) and Databricks for large-scale data processing.
- Integrate data quality validations into CI/CD pipelines and support proactive issue detection.
- Perform root cause analysis for data issues and collaborate with data engineering, analytics, and business teams to resolve them.
- Define and enforce data quality standards, metrics, and SLAs.
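A data-quality framework of the kind described above often boils down to small, composable checks. This sketch, with made-up rows and thresholds, shows completeness, uniqueness, and freshness checks in plain Python:

```python
from datetime import datetime, timedelta

# Hypothetical pipeline output rows; a real framework would read these
# from Databricks/Synapse tables rather than an in-memory list.
rows = [
    {"id": 1, "amount": 25.0, "loaded_at": datetime.now()},
    {"id": 2, "amount": None, "loaded_at": datetime.now()},
    {"id": 2, "amount": 30.0, "loaded_at": datetime.now() - timedelta(days=2)},
]

def check_completeness(rows, field):
    # Share of rows where the field is populated.
    return sum(r[field] is not None for r in rows) / len(rows)

def check_uniqueness(rows, key):
    # True when the key column has no duplicate values.
    return len({r[key] for r in rows}) == len(rows)

def check_freshness(rows, field, max_age):
    # True when all rows were loaded within the allowed window.
    cutoff = datetime.now() - max_age
    return all(r[field] >= cutoff for r in rows)

print(round(check_completeness(rows, "amount"), 2))           # 0.67
print(check_uniqueness(rows, "id"))                           # False
print(check_freshness(rows, "loaded_at", timedelta(days=1)))  # False
```

In a CI/CD pipeline, results like these would be compared against agreed SLAs and fail the build (or page the on-call) when a threshold is breached.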
Required Skills & Qualifications:
- Strong experience (8–15 years) in data engineering, data quality, or data automation roles.
- Hands-on expertise with Azure data ecosystem and Databricks.
- Strong programming skills in Python and SQL.
- Experience building automated data validation and reconciliation frameworks.
- Solid understanding of data warehousing, data lakes, and distributed data processing.
- Familiarity with DevOps/CI-CD practices for data platforms.
Preferred Skills:
- Experience with data observability or data quality tools.
- Exposure to cloud-scale analytics and performance optimization.
- Strong communication and stakeholder management skills.
Surescripts serves the nation through simpler, trusted health intelligence sharing, in order to increase patient safety, lower costs and ensure quality care. We deliver insights at critical points of care for better decisions - from streamlining prior authorizations to delivering comprehensive medication histories to facilitating messages between providers.
The Strategic Data (RWD) Acquisition Manager will be an integral part of Surescripts' data ecosystem, executing negotiations with Surescripts Network Alliance partners to secure data usage rights while also identifying and acquiring new, strategic data sources. This person will play a critical role in maintaining access to the high-quality data necessary for developing solutions that deliver value and improve the experience for stakeholders across the healthcare ecosystem. This position requires a deep understanding of healthcare data and the regulatory landscape, along with business development experience, to successfully negotiate and secure data agreements that will enhance our product portfolio.
Responsibilities:
- Identify and evaluate potential data sources of interest that expand Surescripts' data portfolio. Create comprehensive value propositions for how the data could be used within Surescripts' solutions, and develop valuations to support acquisition offers to data sources.
- Drive business development efforts to secure agreements that enhance Surescripts' data portfolio. With guidance from leadership, execute strategies to identify and approach potential data partners, and successfully negotiate terms.
- Collaborate with sales and product teams to develop strategies to align customer incentives with broader data-dependent initiatives. Interface with Surescripts Network Alliance partners to negotiate data usage rights, ensuring alignment with business goals and regulatory requirements.
- Interface with data providers, industry partners, and other stakeholders.
- Manage day-to-day data procurement-related inquiries and negotiations with data providers and customers.
- Maintain a thorough understanding of privacy laws, including HIPAA permitted purposes. Collaborate with compliance, privacy, security, and data governance teams to ensure all data procurement activities comply with all state and federal regulations, internal policies, and customer contracts.
- Monitor and report on data procurement activities. Track progress of data procurement efforts, report on key metrics, and provide regular updates to senior management. Proactively identify and address any challenges or obstacles in the procurement process. Monitor and evaluate the ROI of data acquisition initiatives to prioritize high-impact opportunities.
- Keep up-to-date with the latest developments in data rights, privacy regulations, and the healthcare industry. Apply and share this knowledge to improve data procurement strategies and ensure the company remains compliant and competitive.
Qualifications:
Basic Requirements:
- Bachelor's degree in Business, Economics, Data Science, or related field.
- 8+ years of experience in business development and/or related experience in the procurement/acquisition of healthcare data.
- Strong understanding of regulations around healthcare data, including Health Insurance Portability and Accountability Act (HIPAA) and Trusted Exchange Framework and Common Agreement (TEFCA).
- Ability to evaluate the value and quality of data assets and their applicability to business needs.
- Proven experience in negotiating contracts and managing vendor relationships.
- Demonstrated success in business development and deal negotiation.
- Excellent written and verbal communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Ability to travel for team, customer and vendor meetings as needed.
- Strategic thinker with strong analytical and problem-solving abilities and results-driven mindset.
Preferred Qualifications:
- MBA or advanced degree preferred in a related field.
- Strong understanding of healthcare interoperability standards, such as Fast Healthcare Interoperability Resource (FHIR).
- Strong understanding of electronic health records (EHR), pharmacy and claims data, health information exchanges (HIE), and TEFCA qualified health information networks (QHINs).
- Familiarity with data governance tools (e.g., data mapping, lineage).
#LI-remote
Surescripts embraces flexibility through its Flexible Hybrid Work model for most positions. This model allows employees to work virtually while still utilizing our offices as collaboration centers. With alignment and agreement from your leadership, you can come and go from the office as needed.
To be considered for employment, applicants must have valid U.S. work authorization allowing work without restrictions with Surescripts in the U.S. At this time, we are unable to provide support or sponsorship for immigration benefits such as work visas. Additionally, we do not participate in academic training or work-study programs through an academic institution that require employer endorsement of F-1/CPT or F-1/STEM.
Why Wait? Apply Now
We're a midsize company. This means you're not just another employee ID number. Here, you can build real relationships and feel supported by truly awesome people with diverse backgrounds and talents in an innovative and collaborative work culture. We strive to create an environment where you can be yourself, share your ideas and work your way. We offer opportunities for employee development, as well as competitive compensation packages and extensive benefits.
Benefits include, but are not limited to, comprehensive healthcare (including infertility coverage), generous paid time off including paid childbirth and parental leave and mental health days, pet insurance, and 401(k) with company match and immediate vesting. To learn more, review the Keep You and Yours Healthy, Balancing Work and Life, and Where Talent Takes Shape links under the Better Benefits. Better Work. Better Life section of our careers site.
While performing duties of this job, an employee may be required to perform any, or all of the following: attend meetings in and out of the office, travel, communicate effectively (both orally and in writing), and be able to effectively use computers and other electronic and standard office equipment with, or without, a reasonable accommodation. Additionally, this job requires certain mental demands, including the ability to use judgement, withstand moderate amounts of stress and maintain attention to detail with, or without, a reasonable accommodation.
Surescripts is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate on the basis of race, color, religion, age, national origin, ancestry, disability, medical condition, marital status, pregnancy, genetic information, gender, sexual orientation, parental status, gender identity, gender expression, veteran status, or any other status protected under federal, state, or local law.
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences, and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
About tvScientific
tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one, efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.
We are seeking a Staff Data Engineer to lead the design, implementation, and evolution of our identity services and data governance platform. This role is critical to ensuring trusted, privacy-safe, and well-governed data across the organization. You will work at the intersection of data engineering, identity resolution, privacy, and platform reliability. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.
What you'll do:
- Identity Services:
- Design and maintain a scalable identity resolution platform
- Build pipelines and services to ingest, normalize, link, and version identity data across multiple sources
- Ensure deterministic and probabilistic matching logic that is transparent, auditable, and measurable
- Partner with product and analytics teams to expose identity data through reliable, well-documented APIs and datasets
- Build and operate batch and streaming pipelines using modern data stack tools
- Create clear documentation, standards, and runbooks for identity and governance systems
- Data Governance & Trust
- Own data governance foundations including data lineage, quality checks, schema enforcement, and access controls
- Implement privacy-by-design principles (PII handling, consent enforcement, retention policies)
- Collaborate with legal, privacy, and security teams to operationalize regulatory requirements (e.g., GDPR, CCPA)
- Establish monitoring and alerting for data quality, freshness, and integrity
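Deterministic identity resolution of the kind listed above can be illustrated with a union-find linkage over shared identifiers. The records and field names here are invented for the sketch; production matching would also layer in probabilistic scoring and audit trails:

```python
# Records that share an email or phone are linked into one profile.
records = [
    {"id": "r1", "email": "a@x.com", "phone": "555-0100"},
    {"id": "r2", "email": "b@x.com", "phone": "555-0100"},
    {"id": "r3", "email": "b@x.com", "phone": "555-0199"},
    {"id": "r4", "email": "c@x.com", "phone": "555-0200"},
]

parent = {r["id"]: r["id"] for r in records}

def find(i):
    # Root lookup with path halving.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(i, j):
    parent[find(i)] = find(j)

# Link records that share any identifier value.
seen = {}
for r in records:
    for key in ("email", "phone"):
        value = (key, r[key])
        if value in seen:
            union(r["id"], seen[value])
        else:
            seen[value] = r["id"]

# Group record ids by their resolved root: each group is one profile.
clusters = {}
for r in records:
    clusters.setdefault(find(r["id"]), []).append(r["id"])
print(sorted(clusters.values()))  # [['r1', 'r2', 'r3'], ['r4']]
```

Note the transitive link: r1 and r3 share no identifier directly, but both connect through r2, which is why a union-find structure (rather than pairwise comparison alone) is the natural fit.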
What we're looking for:
- 5+ years of proven data engineering experience building data infrastructure using Spark with Scala
- Experience in delivering significant technical initiatives and building reliable, large scale services
- Experience in delivering APIs backed by relationship-heavy datasets
- Experience implementing data governance practices, including data quality, metadata management, and access controls
- Strong understanding of privacy-by-design principles and handling of sensitive or regulated data
- Familiarity with data lakes, cloud warehouses, and storage formats
- Strong proficiency in AWS services
- Successful design and implementation of scalable and efficient data infrastructure
- High attention to detail in implementation of automated data quality checks
- Effective collaboration with cross-functional teams
- Excellent written and verbal communication skills
- Bachelor's degree in Computer Science or a related field
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
#LI-SM4
#LI-REMOTE
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US based applicants only: $155,584—$320,320 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
Location: 100% Remote
Duration: 12+ Months
Overview:
We are seeking an experienced Administrator to operate and support the enterprise implementation of Microsoft Purview Data Catalog across a complex, multi-platform data environment. The Administrator will be responsible for the day-to-day configuration, monitoring, and maintenance of Purview capabilities, ensuring reliable metadata ingestion, catalog quality, lineage visibility, and compliance alignment across governed data domains.
This role focuses on platform operations and governance execution, working within established architecture and enterprise governance standards.
Key Responsibilities
Platform Administration & Operations:
- Administer and operate Microsoft Purview Data Map and Data Catalog environments.
- Monitor platform health, scan execution, metadata ingestion, and lineage availability.
- Troubleshoot and resolve catalog, scan, and connectivity issues.
- Perform routine maintenance, configuration updates, and service optimizations.
- Coordinate incident resolution with internal engineering teams and Microsoft support as required.
Data Source Management & Scanning:
- Register, configure, and maintain data sources across Azure, M365, on-prem, and approved third-party platforms.
- Configure and schedule metadata scans for supported sources.
- Manage authentication for scans using managed identities, service principals, and Key Vault secrets.
- Monitor scan performance, failures, and coverage; take corrective action as needed.
- Optimize scan frequency and scope to balance cost, performance, and governance coverage.
Catalog Configuration & Metadata Management:
- Maintain and enforce enterprise metadata standards within the Purview Catalog.
- Manage business metadata, classifications, glossary terms, and custom attributes.
- Ensure metadata accuracy, completeness, and consistency across data assets.
- Support curation activities including asset certification and publishing.
- Resolve duplicate, incomplete, or stale catalog entries.
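Resolving duplicate and stale entries, as in the last bullet, usually starts with a hygiene report over an asset export. A minimal sketch, assuming asset records carry a `qualifiedName` and a `lastScanned` timestamp (assumed keys; real catalog exports will differ):

```python
from datetime import datetime, timedelta

def find_catalog_issues(assets, now, stale_after_days=90):
    """Flag duplicate qualified names and assets not rescanned within
    the stale window. Key names are illustrative assumptions."""
    seen, duplicates, stale = set(), set(), []
    cutoff = now - timedelta(days=stale_after_days)
    for asset in assets:
        name = asset["qualifiedName"]
        if name in seen:
            duplicates.add(name)
        seen.add(name)
        if asset["lastScanned"] < cutoff:
            stale.append(name)
    return sorted(duplicates), stale

# Hypothetical export: one duplicated asset, one not scanned since January.
now = datetime(2024, 6, 1)
assets = [
    {"qualifiedName": "mssql://srv/db/dbo/orders", "lastScanned": datetime(2024, 5, 20)},
    {"qualifiedName": "mssql://srv/db/dbo/orders", "lastScanned": datetime(2024, 5, 28)},
    {"qualifiedName": "adls://raw/events", "lastScanned": datetime(2024, 1, 10)},
]
dupes, stale = find_catalog_issues(assets, now)
```

Entries surfaced this way would then go through the curation workflow above (merge or retire duplicates, rescan or deprecate stale assets).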
Lineage & Discovery Enablement:
- Enable and validate data lineage ingestion from supported data platforms.
- Monitor lineage completeness and visibility for critical data assets.
- Assist data consumers and stewards with lineage-based impact analysis.
- Escalate lineage gaps or tool limitations requiring architectural or engineering remediation.
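The impact analysis mentioned above amounts to a downstream walk over the lineage graph. A self-contained sketch, using a plain edge list as an illustrative stand-in for lineage exported from the catalog:

```python
from collections import defaultdict, deque

def downstream_impact(edges, asset):
    """Breadth-first walk over (upstream, downstream) lineage edges to
    list every asset affected by a change to `asset`. The edge-list
    input is an assumption for the sketch."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    seen, queue = set(), deque([asset])
    while queue:
        node = queue.popleft()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen)

# Hypothetical pipeline: raw → staging → warehouse → report.
edges = [
    ("raw.orders", "stg.orders"),
    ("stg.orders", "dw.orders"),
    ("dw.orders", "rpt.sales"),
    ("raw.customers", "stg.customers"),
]
impacted = downstream_impact(edges, "raw.orders")
print(impacted)  # → ['dw.orders', 'rpt.sales', 'stg.orders']
```

Gaps found this way (assets with no upstream edges where one is expected) are the escalation cases called out in the last bullet.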
Security, Access & Governance Controls:
- Configure and manage Purview role-based access control (RBAC) within collections.
- Provision and maintain access for administrators, data curators, and data stewards.
- Enforce domain-based access controls and separation of duties.
- Integrate Purview access with Microsoft Entra ID.
- Support sensitivity labels and classification alignment with Microsoft Information Protection.
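Enforcing separation of duties, as required above, can be checked mechanically against a role-assignment export. In this sketch the data shape and role names are illustrative assumptions; in practice they would map to the collection role assignments actually configured in Purview.

```python
def check_separation_of_duties(assignments, conflicting_pairs):
    """Flag principals holding both roles of any conflicting pair
    within the same collection. `assignments` maps
    (principal, collection) → set of role names (assumed shape)."""
    violations = []
    for (who, collection), roles in assignments.items():
        for a, b in conflicting_pairs:
            if a in roles and b in roles:
                violations.append((who, collection, a, b))
    return violations

# Hypothetical assignments: alice holds a conflicting pair in 'finance'.
assignments = {
    ("alice", "finance"): {"Collection Admin", "Data Curator"},
    ("bob", "finance"): {"Data Reader"},
}
violations = check_separation_of_duties(
    assignments, [("Collection Admin", "Data Curator")])
```

A periodic run of this check gives auditors a concrete artifact showing that the access controls above are actually being enforced.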
Compliance & Risk Support:
- Support automated discovery of sensitive data (PII, PCI, PHI).
- Assist risk, audit, and compliance teams with catalog evidence and reporting.
- Validate scan coverage for regulated data domains.
- Support regulatory and audit initiatives (SOX, GLBA, NYDFS, GDPR, etc.).
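Validating scan coverage for regulated domains, as in the bullets above, can be reported as the per-domain share of assets carrying a sensitive classification. A minimal sketch; the asset keys and classification label strings below are assumptions for illustration, not the product's schema:

```python
def classification_coverage(assets, sensitive_labels):
    """Per-domain fraction of assets carrying at least one sensitive
    classification. Input shape is assumed for the sketch."""
    totals, tagged = {}, {}
    for asset in assets:
        domain = asset["domain"]
        totals[domain] = totals.get(domain, 0) + 1
        if sensitive_labels & set(asset["classifications"]):
            tagged[domain] = tagged.get(domain, 0) + 1
    return {d: tagged.get(d, 0) / totals[d] for d in totals}

# Hypothetical asset export and sensitive-label set.
assets = [
    {"domain": "claims", "classifications": ["PERSONAL.NAME"]},
    {"domain": "claims", "classifications": []},
    {"domain": "marketing", "classifications": []},
]
coverage = classification_coverage(assets, {"PERSONAL.NAME", "FINANCIAL.CARD"})
print(coverage)  # → {'claims': 0.5, 'marketing': 0.0}
```

Low-coverage regulated domains flagged by a report like this are the ones to prioritize for rescans and manual classification review.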
User Support & Enablement:
- Provide operational support to data producers, consumers, and data stewards.
- Respond to access requests, catalog issues, and usage questions.
- Maintain operational documentation, runbooks, and standard operating procedures.
- Support onboarding of new data domains following established governance patterns.
- Assist with training and adoption initiatives led by governance or architecture teams.
Required Qualifications:
- 5+ years of experience supporting enterprise data platforms or governance tools, and 4+ years of hands-on Microsoft Purview experience at enterprise scale.
- Hands-on experience administering Microsoft Purview Data Catalog.
- Strong understanding of metadata management, data classification, and lineage concepts.
- Working knowledge of Azure data services and enterprise data ecosystems.
- Experience managing access controls and identities using Microsoft Entra ID.
- Familiarity with regulated data environments and compliance requirements.
- Strong troubleshooting, operational support, and documentation skills.
Preferred Qualifications:
- Experience supporting Purview integrations with Synapse, Fabric, Databricks, Snowflake, or SQL Server.
- Exposure to financial services or other regulated industries.
- Experience with PowerShell, REST APIs, or basic automation for operational tasks.
- Prior experience supporting enterprise data governance or stewardship programs.