Translate business process designs into clear master and transactional data definitions for S/4HANA.
Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.
Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).
Define data objects, attributes, and mandatory fields.
Support business rules, validations, and derivations.
Align data structures to SAP best practices and industry standards.
Support data cleansing, enrichment, and harmonization activities.
Define and validate data mapping rules from legacy systems to S/4HANA.
Participate in mock conversions, data loads, and reconciliation activities.
Ensure data quality thresholds are met prior to cutover.
Support the establishment and enforcement of global data standards and policies.
Work closely with Master Data and Data Governance teams.
Help define roles, ownership, and stewardship models for value stream data.
Contribute to data quality monitoring and remediation processes.
Support functional and integrated testing with a strong focus on data accuracy.
Validate business scenarios using migrated and created data.
Support cutover planning and execution from a data perspective.
Provide post-go-live support and stabilization.
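As a rough illustration of the mapping, validation, and reconciliation work described above, the toy Python sketch below applies hypothetical legacy-to-S/4HANA mapping rules and tallies a mock-conversion reconciliation. The field names, mapping table, and mandatory-field rules are invented for illustration and would come from the project's actual data definitions.

```python
# Hypothetical sketch: validating legacy-to-S/4HANA mapping rules before a mock load.
# Field names and rules are illustrative, not taken from any real migration template.

LEGACY_TO_S4_MAP = {
    "LIFNR": "BusinessPartner",  # legacy vendor number -> business partner ID
    "MATNR": "Material",
    "WERKS": "Plant",
}

MANDATORY_TARGET_FIELDS = {"BusinessPartner", "Material"}

def map_record(legacy_record):
    """Apply mapping rules to one legacy record; unmapped fields are dropped."""
    return {
        target: legacy_record[source]
        for source, target in LEGACY_TO_S4_MAP.items()
        if source in legacy_record
    }

def validate_record(record):
    """Return a list of rule violations (empty list means the record passes)."""
    return [
        f"missing mandatory field {field}"
        for field in MANDATORY_TARGET_FIELDS
        if not record.get(field)
    ]

def reconcile(legacy_rows):
    """Mock-conversion reconciliation: counts of extracted, loadable, and failed rows."""
    mapped = [map_record(r) for r in legacy_rows]
    failed = [m for m in mapped if validate_record(m)]
    return {
        "extracted": len(legacy_rows),
        "loaded": len(mapped) - len(failed),
        "failed": len(failed),
    }
```

In practice the reconciliation counts would be compared against load logs from the Migration Cockpit run before sign-off on a cutover quality gate.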
Requirements: 5 years of SAP functional experience with a strong data focus.
Hands-on experience with SAP S/4HANA (greenfield preferred).
Proven involvement in large-scale, global ERP implementations.
Deep understanding of value stream business processes and related data objects.
Experience supporting data migration, cleansing, and validation.
Required Skills: Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).
Understanding of S/4HANA data model changes vs. ECC.
Experience working with SAP MDG or similar governance tools preferred.
Familiarity with data migration tools (e.g., SAP Migration Cockpit, LVM, ETL tools).
Ability to read and interpret functional specs and data models.
Strong stakeholder management and communication skills.
Ability to work across global, cross-functional teams.
Detail-oriented with strong analytical and problem-solving skills.
Comfortable operating in a fast-paced transformation environment.
Preferred Skills: Experience in manufacturing, building materials, or asset-intensive industries.
Prior role as Functional Data Lead or Data Domain Lead.
Experience defining global templates and harmonized data models.
Knowledge of data quality tools and metrics.
Experience with MDG and setting up cost center and profit center groups.
Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.
Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture.
Drive long-term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
Conduct root cause analysis and implement preventive and long-term remediation solutions.
Optimize SQL queries, tune stored procedures, and improve data processing performance.
Document audit findings, validation processes, data flows, standards, and quality reports.
Build dashboards and reports for data quality KPIs using Power BI/Tableau.
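The profiling and rule-evaluation work above can be sketched in a few lines of Python. This is a deliberately toy, framework-free illustration; the rule names, fields, and thresholds are assumptions, and a production version would run as T-SQL validation logic or SSIS packages feeding the KPI dashboards.

```python
# Toy rule-based profiling sketch; rules and field names are invented for illustration.

def profile(rows, rules):
    """Evaluate each quality rule over all rows; return per-rule pass rates (0.0-1.0)."""
    report = {}
    for name, check in rules.items():
        passed = sum(1 for row in rows if check(row))
        report[name] = round(passed / len(rows), 3) if rows else None
    return report

# Example completeness and validity rules.
RULES = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
```

Pass rates like these are the raw material for the data quality KPIs surfaced in Power BI or Tableau.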
Data Stewardship & Governance
Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
Ensure proper and consistent data usage across departments and systems.
Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
Provide training on data entry, data handling, stewardship practices, and data literacy.
Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield / Low-Maturity Environment
Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
Drive adoption of data quality and governance practices across business and technical teams.
Support long-term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
Advanced T-SQL, SQL Server development, debugging, and performance tuning.
SSIS development, deployment, and troubleshooting.
Data profiling, validation rule design, quality scoring, and measurement techniques.
ETL/ELT pipeline design, debugging, and optimization.
Data modeling (conceptual, logical, physical).
Metadata management and lineage documentation.
Reporting and dashboarding with Power BI, Tableau, or similar tools.
Strong documentation and communication skills.
Preferred Skills
Knowledge of DAMA DMBoK, DCAM, MDM concepts, and governance frameworks.
Experience in low-maturity/greenfield data environments.
Familiarity with AI/ML data readiness and feature-store-aligned data structuring.
Cloud data engineering exposure (Azure, Databricks, GCP).
Education
Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or related field.
Master’s degree preferred.
Certifications (Preferred)
DAMA CDMP (Associate/Practitioner), EDM Council DCAM, ASQ Data Quality Credential, Collibra Data Steward Certification, Certified Data Steward (eLearningCurve), Cloud/AI certifications (Azure, Databricks, Google)
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Applications Development: Experience developing GenAI applications is a strong plus.
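To make the Medallion Architecture responsibilities above concrete, here is a deliberately minimal, framework-free Python sketch of a bronze-to-silver-to-gold flow. In a real Databricks implementation this logic would live in PySpark jobs writing Delta tables; the field names and cleansing rules below are invented for illustration only.

```python
# Toy Medallion-style flow (bronze -> silver -> gold) using plain Python structures.
# Stands in for what would be PySpark/Delta Lake transformations on Databricks.

def to_silver(bronze_rows):
    """Cleanse the raw (bronze) layer: drop rows missing the key,
    cast amounts to float, and deduplicate on order_id."""
    seen, silver = set(), []
    for row in bronze_rows:
        order_id = row.get("order_id")
        if not order_id or order_id in seen:
            continue
        seen.add(order_id)
        silver.append({"order_id": order_id, "amount": float(row.get("amount", 0))})
    return silver

def to_gold(silver_rows):
    """Aggregate the cleansed (silver) layer into business-ready metrics."""
    return {
        "order_count": len(silver_rows),
        "total_amount": sum(r["amount"] for r in silver_rows),
    }
```

The same layering discipline (raw, cleansed, aggregated) is what the architecture leadership in this role would standardize across pipelines.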
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges, ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Azure Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Azure Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
Job Summary:
Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high-quality data is delivered to end users
- Ensure high quality of in-house data through data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL and continuously engage with end consumers with enhancements
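A small sketch of the SQL-based profiling and quality-monitoring duties above, using Python's built-in sqlite3 as a stand-in for the team's actual databases. The table, column, and rule below are invented purely for illustration.

```python
import sqlite3

# Illustrative stewardship check: measure how many rows fail a completeness rule.
# Table and column names are hypothetical stand-ins for real source systems.

def null_rate(conn, table, column):
    """Share of rows where `column` is NULL or empty (0.0-1.0), or None if table is empty."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    bad = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL OR {column} = ''"
    ).fetchone()[0]
    return bad / total if total else None

# In-memory demo data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@x.com"), (2, ""), (3, None), (4, "b@x.com")],
)
```

Checks like this would typically be scheduled, with failures above a threshold feeding the data support ticket queue.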
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles, such as big data cloud compute, NoSQL, etc.
- Applied experience querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- Big Query experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure the accuracy and consistency of data.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology infrastructure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained – either through formal education or through experience – in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
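To illustrate the RAG-pipeline responsibility listed above, here is a minimal, library-free sketch of the retrieval step: ranking indexed documents by cosine similarity to a query vector. The vectors and documents below are hand-made toys; a real pipeline would use learned embeddings and a managed vector database such as Pinecone or Vertex AI Vector Search.

```python
import math

# Toy retrieval step of a Retrieval-Augmented Generation (RAG) pipeline.
# Embedding vectors here are invented; real ones come from an embedding model.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, index, k=1):
    """Return the text of the k documents most similar to the query vector."""
    ranked = sorted(index, key=lambda doc: cosine(query_vec, doc["vec"]), reverse=True)
    return [doc["text"] for doc in ranked[:k]]

# Tiny hand-built "vector index" standing in for a curated knowledge base.
INDEX = [
    {"text": "store hours policy", "vec": [1.0, 0.1, 0.0]},
    {"text": "private label suppliers", "vec": [0.0, 1.0, 0.2]},
]
```

The retrieved passages would then be injected into the prompt of a generative model, which is the "augmented generation" half of the pipeline.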
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements
Who We Are
At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.
What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.
Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?
Role Summary:
The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.
Responsibilities:
Data Architecture & Tooling
- Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
- Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
- Ensure alignment between data infrastructure and evolving business needs across distribution operations
- Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite
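As a hypothetical illustration of the SuiteQL work above, the sketch below composes a query of the kind that could be sent to NetSuite's SuiteQL REST endpoint (`/services/rest/query/v1/suiteql`). The table and field names are assumptions for illustration and would need to match the account's actual record model; month-end date handling is deliberately simplified.

```python
# Hypothetical SuiteQL composition; schema names are illustrative assumptions.

def monthly_sales_query(year, month):
    """Build a SuiteQL string for sales order totals in a given month
    (illustrative schema; month-end handling omitted for brevity)."""
    return (
        "SELECT tranid, foreigntotal "
        "FROM transaction "
        "WHERE type = 'SalesOrd' "
        f"AND trandate BETWEEN '{month:02d}/01/{year}' AND '{month:02d}/28/{year}'"
    )

# The REST endpoint expects the query in a JSON body under the "q" key.
payload = {"q": monthly_sales_query(2024, 3)}
```

Queries like this, alongside saved searches, would feed the downstream reporting layers the role owns.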
Data Governance & Quality
- Define and enforce data standards, metric definitions, and naming conventions across all business domains
- Establish data ownership, lineage documentation, and access governance policies
- Implement monitoring and alerting for data quality issues across source systems and the warehouse
- Build and maintain a data dictionary that serves as the single source of truth for the organization
Orchestration of Analysts & Systems
- Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
- Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
- Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
- Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business
Qualifications:
Required
- 3-5 years of experience in data analytics, business intelligence, or data engineering
- 2+ years in a lead or management role overseeing analysts or data team members
- Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
- Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
- Proven track record establishing data governance standards and documentation practices
- Experience integrating and managing multiple data sources across SaaS and ERP platforms
- Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders
Preferred
- Experience in distribution, wholesale, or supply chain environments
- Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
- Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
- Background working within IT-led or hybrid IT/Analytics teams
Benefits:
- Health insurance
- Dental insurance
- Vision insurance
- Life & Disability insurance
- 401(K) with company match
Company Paid holidays and PTO:
- Feetures offers 20 PTO Days which are available to you on day one of employment and are available to all employees, no matter your role. After working at Feetures for 5 years, your PTO days will increase to 25 days. Days can be used for vacations, appointments and sick days.
- We offer 10 company paid holidays and 1 floating holiday per year.
Perks:
- Parking provided (Charlotte office and onsite at Hickory office)
- Employee Engagement team
- Monthly stipend to pursue an active lifestyle
Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.
Surescripts serves the nation through simpler, trusted health intelligence sharing, in order to increase patient safety, lower costs and ensure quality care. We deliver insights at critical points of care for better decisions - from streamlining prior authorizations to delivering comprehensive medication histories to facilitating messages between providers.
The Strategic Data (RWD) Acquisition Manager will be an integral part of Surescripts' data ecosystem by executing negotiations with Surescripts Network Alliance partners to secure data usage rights, while also identifying and acquiring new, strategic data sources. This person will play a critical role in maintaining access to the high-quality data necessary for the development of solutions that will deliver value and improve the experience for stakeholders across the healthcare ecosystem. This position requires a deep understanding of healthcare data, the regulatory landscape, and business development experience to successfully negotiate and secure data agreements that will enhance our product portfolio.
Responsibilities:
- Identify and evaluate potential data sources of interest that expand Surescripts' data portfolio. Create comprehensive value propositions for how the data could be used within Surescripts' solutions, and develop valuations of the data to support acquisition offers to data sources.
- Drive business development efforts to secure agreements that enhance Surescripts' data portfolio. With guidance from leadership, execute strategies to identify and approach potential data partners, and successfully negotiate terms.
- Collaborate with sales and product teams to develop strategies to align customer incentives with broader data-dependent initiatives. Interface with Surescripts Network Alliance partners to negotiate data usage rights, ensuring alignment with business goals and regulatory requirements.
- Interface with data providers, industry partners, and other stakeholders.
- Manage day-to-day data procurement-related inquiries and negotiations with data providers and customers.
- Maintain a thorough understanding of privacy laws, including HIPAA permitted purposes. Collaborate with compliance, privacy, security, and data governance teams to ensure all data procurement activities comply with all state and federal regulations, internal policies, and customer contracts.
- Monitor and report on data procurement activities. Track progress of data procurement efforts, report on key metrics, and provide regular updates to senior management. Proactively identify and address any challenges or obstacles in the procurement process. Monitor and evaluate the ROI of data acquisition initiatives to prioritize high-impact opportunities.
- Keep up-to-date with the latest developments in data rights, privacy regulations, and the healthcare industry. Apply and share this knowledge to improve data procurement strategies and ensure the company remains compliant and competitive.
Qualifications:
Basic Requirements:
- Bachelor's degree in Business, Economics, Data Science, or related field;
- 8+ years of experience in business development and/or related experience in the procurement/acquisition of healthcare data.
- Strong understanding of regulations around healthcare data, including Health Insurance Portability and Accountability Act (HIPAA) and Trusted Exchange Framework and Common Agreement (TEFCA).
- Ability to evaluate the value and quality of data assets and their applicability to business needs.
- Proven experience in negotiating contracts and managing vendor relationships.
- Demonstrated success in business development and deal negotiation.
- Excellent written and verbal communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Ability to travel for team, customer and vendor meetings as needed.
- Strategic thinker with strong analytical and problem-solving abilities and results-driven mindset.
Preferred Qualifications:
- MBA or other advanced degree in a related field preferred.
- Strong understanding of healthcare interoperability standards, such as Fast Healthcare Interoperability Resource (FHIR).
- Strong understanding of electronic health records (EHR), pharmacy and claims data, health information exchanges (HIE), and TEFCA qualified health information networks (QHINs).
- Familiarity with data governance tools (e.g., data mapping, lineage).
#LI-remote
Surescripts embraces flexibility through its Flexible Hybrid Work model for most positions. This model allows employees to work virtually while still utilizing our offices as collaboration centers. With alignment and agreement from your leadership, you can come and go from the office as needed.
To be considered for employment, applicants must have valid U.S. work authorization allowing work without restrictions with Surescripts in the U.S. At this time, we are unable to provide support or sponsorship for immigration benefits such as work visas. Additionally, we do not participate in academic training programs or work-study programs through an academic institution that require employer endorsement of F-1/CPT or F-1/STEM.
Why Wait? Apply Now
We're a midsize company. This means you're not just another employee ID number. Here, you can build real relationships and feel supported by truly awesome people with diverse backgrounds and talents in an innovative and collaborative work culture. We strive to create an environment where you can be yourself, share your ideas and work your way. We offer opportunities for employee development, as well as competitive compensation packages and extensive benefits.
Benefits include, but are not limited to, comprehensive healthcare (including infertility coverage), generous paid time off including paid childbirth and parental leave and mental health days, pet insurance, and 401(k) with company match and immediate vesting. To learn more, review the Keep You and Yours Healthy, Balancing Work and Life, and Where Talent Takes Shape links under the Better Benefits. Better Work. Better Life section of our careers site.
While performing duties of this job, an employee may be required to perform any, or all of the following: attend meetings in and out of the office, travel, communicate effectively (both orally and in writing), and be able to effectively use computers and other electronic and standard office equipment with, or without, a reasonable accommodation. Additionally, this job requires certain mental demands, including the ability to use judgement, withstand moderate amounts of stress and maintain attention to detail with, or without, a reasonable accommodation.
Surescripts is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate on the basis of race, color, religion, age, national origin, ancestry, disability, medical condition, marital status, pregnancy, genetic information, gender, sexual orientation, parental status, gender identity, gender expression, veteran status, or any other status protected under federal, state, or local law.
Location: 100% Remote
Duration: 12+ Months
Overview:
We are seeking an experienced Administrator to operate and support the enterprise implementation of Microsoft Purview Data Catalog across a complex, multi-platform data environment. The Administrator will be responsible for the day-to-day configuration, monitoring, and maintenance of Purview capabilities, ensuring reliable metadata ingestion, catalog quality, lineage visibility, and compliance alignment across governed data domains.
This role focuses on platform operations and governance execution, working within established architecture and enterprise governance standards.
Key Responsibilities
Platform Administration & Operations:
- Administer and operate Microsoft Purview Data Map and Data Catalog environments.
- Monitor platform health, scan execution, metadata ingestion, and lineage availability.
- Troubleshoot and resolve catalog, scan, and connectivity issues.
- Perform routine maintenance, configuration updates, and service optimizations.
- Coordinate incident resolution with internal engineering teams and Microsoft support as required.
Data Source Management & Scanning:
- Register, configure, and maintain data sources across Azure, M365, on-prem, and approved third-party platforms.
- Configure and schedule metadata scans for supported sources.
- Manage authentication for scans using managed identities, service principals, and Key Vault secrets.
- Monitor scan performance, failures, and coverage; take corrective action as needed.
- Optimize scan frequency and scope to balance cost, performance, and governance coverage.
Catalog Configuration & Metadata Management:
- Maintain and enforce enterprise metadata standards within the Purview Catalog.
- Manage business metadata, classifications, glossary terms, and custom attributes.
- Ensure metadata accuracy, completeness, and consistency across data assets.
- Support curation activities including asset certification and publishing.
- Resolve duplicate, incomplete, or stale catalog entries.
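The duplicate and stale-entry cleanup described above can be sketched as a keyed de-duplication over catalog asset records. This is a minimal illustration only: the record shape and field names (`qualified_name`, `updated_at`) are assumptions for the sketch, not Purview's actual API schema.

```python
from datetime import datetime, timedelta, timezone

def dedupe_catalog_assets(assets, stale_after_days=180):
    """Keep the most recently updated record per qualified name and
    flag surviving entries not touched within the staleness window."""
    latest = {}
    for asset in assets:
        # Treat names differing only by case as duplicates of one asset.
        key = asset["qualified_name"].lower()
        if key not in latest or asset["updated_at"] > latest[key]["updated_at"]:
            latest[key] = asset
    cutoff = datetime.now(timezone.utc) - timedelta(days=stale_after_days)
    kept = list(latest.values())
    stale = [a["qualified_name"] for a in kept if a["updated_at"] < cutoff]
    return kept, stale

# Hypothetical catalog entries: the same table registered twice with different casing.
assets = [
    {"qualified_name": "mssql://srv/db/Sales", "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"qualified_name": "mssql://srv/db/sales", "updated_at": datetime(2025, 6, 1, tzinfo=timezone.utc)},
]
kept, stale = dedupe_catalog_assets(assets)
```

In practice the same keep-newest-per-key rule would be applied to assets exported from the catalog, with flagged entries routed to curators rather than deleted automatically.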
Lineage & Discovery Enablement:
- Enable and validate data lineage ingestion from supported data platforms.
- Monitor lineage completeness and visibility for critical data assets.
- Assist data consumers and stewards with lineage-based impact analysis.
- Escalate lineage gaps or tool limitations requiring architectural or engineering remediation.
Security, Access & Governance Controls:
- Configure and manage Purview role-based access control (RBAC) within collections.
- Provision and maintain access for administrators, data curators, and data stewards.
- Enforce domain-based access controls and separation of duties.
- Integrate Purview access with Microsoft Entra ID.
- Support sensitivity labels and classification alignment with Microsoft Information Protection.
Compliance & Risk Support:
- Support automated discovery of sensitive data (PII, PCI, PHI).
- Assist risk, audit, and compliance teams with catalog evidence and reporting.
- Validate scan coverage for regulated data domains.
- Support regulatory and audit initiatives (SOX, GLBA, NYDFS, GDPR, etc.).
User Support & Enablement:
- Provide operational support to data producers, consumers, and data stewards.
- Respond to access requests, catalog issues, and usage questions.
- Maintain operational documentation, runbooks, and standard operating procedures.
- Support onboarding of new data domains following established governance patterns.
- Assist with training and adoption initiatives led by governance or architecture teams.
Required Qualifications:
- 5+ years of experience supporting enterprise data platforms or governance tools, and 4+ years of hands-on Microsoft Purview experience at enterprise scale.
- Hands-on experience administering Microsoft Purview Data Catalog.
- Strong understanding of metadata management, data classification, and lineage concepts.
- Working knowledge of Azure data services and enterprise data ecosystems.
- Experience managing access controls and identities using Microsoft Entra ID.
- Familiarity with regulated data environments and compliance requirements.
- Strong troubleshooting, operational support, and documentation skills.
Preferred Qualifications:
- Experience supporting Purview integrations with Synapse, Fabric, Databricks, Snowflake, or SQL Server.
- Exposure to financial services or other regulated industries.
- Experience with PowerShell, REST APIs, or basic automation for operational tasks.
- Prior experience supporting enterprise data governance or stewardship programs.
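As an illustration of the REST-based operational automation mentioned in the preferred qualifications, the sketch below only constructs the request for triggering a Purview scan run; it does not send it. The endpoint path and `api-version` shown are assumptions to verify against current Microsoft documentation, and a real script would attach a Microsoft Entra ID bearer token before issuing the call.

```python
import json
import uuid
from urllib.parse import quote

def build_scan_run_request(account, datasource, scan, api_version="2022-07-01-preview"):
    """Return method, URL, headers, and body for triggering a Purview scan run.
    The endpoint shape and api-version are assumptions; check Microsoft docs."""
    run_id = str(uuid.uuid4())  # client-supplied run identifier
    url = (
        f"https://{quote(account)}.purview.azure.com/scan"
        f"/datasources/{quote(datasource)}/scans/{quote(scan)}"
        f"/runs/{run_id}?api-version={api_version}"
    )
    body = json.dumps({"scanLevel": "Incremental"})  # or "Full" for a complete rescan
    headers = {"Content-Type": "application/json"}   # plus "Authorization: Bearer <token>"
    return "PUT", url, headers, body

method, url, headers, body = build_scan_run_request("contoso", "AzureSqlDb", "WeeklyScan")
```

The same operation is available through the PowerShell and Azure CLI tooling for Purview; the REST form is shown because it is the easiest to embed in scheduled operational scripts.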
The Business Data Analyst will play a critical role in supporting data-driven decision-making for core PMA business functions. This position is focused on extracting valuable insights from complex datasets, creating operational reports, and developing intuitive BI dashboards tailored to business needs. Working within an enterprise reporting structure, the analyst will perform on-demand data discovery, conduct trend analysis, and develop analytics tools that empower stakeholders with meaningful insights. By ensuring data accuracy, quality and relevance, this role will support data governance activities and continuous process improvements that align with strategic objectives.
Responsibilities:
Data Analysis & Business Insights
* Conduct in-depth data analysis to support strategic business initiatives.
* Perform trend analysis and develop predictive insights to help business teams identify patterns, risks, and opportunities.
* Respond to data discovery requests and develop operational reports to support key business metrics and decision-making.
* Apply analytical best practices and make recommendations that improve business understanding of the data.
* Translate complex data findings into actionable recommendations, presenting insights in a clear and meaningful way for non-technical stakeholders.
Enterprise Reporting & BI Dashboard Development
* Work closely with business stakeholders to understand their reporting needs, providing insights that drive data-informed decisions.
* Design, develop, and maintain interactive BI dashboards tailored to answering key business questions, providing real-time access to critical metrics and performance insights.
* Utilize enterprise BI tools to create data visualizations that enable easy exploration of data and insights.
* Partner with stakeholders to test and refine dashboards, ensuring they align with business requirements and enhance decision-making capabilities.
* Facilitate training and support for business users on BI dashboards and reporting tools, enabling self-service access to data insights.
Data Quality Support & Validation
* Collaborate with data governance and data engineering teams to ensure high data quality and integrity in enterprise reports and dashboards.
* Perform data validation and verification as part of report development to ensure data accuracy, consistency, and relevance for business users.
* Monitor data accuracy metrics and support data issue resolution, maintaining a high standard of data quality across reporting tools.
* Demonstrate commitment to Company's Code of Business Conduct and Ethics, and apply knowledge of compliance policies and procedures, standards and laws applicable to job responsibilities in the performance of work.
Requirements:
* 3+ years of experience in data, analytics, or business intelligence.
* Bachelor's degree in Information Management, Data Science, Computer Science, Mathematics, Statistics, Economics, Psychology or a related field.
* Proficient in SQL for data extraction and manipulation across various data sources.
* Strong analytical skills to interpret complex datasets and draw actionable insights.
* Experience with BI platforms like QlikSense or Power BI for data visualization and dashboard development.
* Familiar with advanced Excel functions for data manipulation and reporting.
* Understanding of statistical methods and trend analysis for identifying patterns and creating projections.
* Familiarity with predictive modeling or basic machine learning concepts is a plus.
* Proficiency with scripting languages or tools (such as Python, R, or VBA) for process automation is a plus.
* Basic understanding of data integration, ETL processes, and data warehousing concepts.
* Skilled in presenting data in a way that tells a compelling story and drives informed decision-making.
* Strong interpersonal skills to work effectively with cross-functional teams in underwriting, finance, and IT.
* High level of precision in data analysis, ensuring reports and insights are accurate and free of errors.
* Analytical mindset to investigate data challenges, identify root causes, and develop efficient solutions.
* Ability to adapt to evolving data requirements and troubleshoot issues with minimal supervision.
* Strong organizational skills to balance multiple projects and meet reporting deadlines.
* Effective time management to handle ad hoc requests and prioritize tasks in a fast-paced environment.
* Open and motivated to learn new tools, methods, and data practices.
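As a concrete example of the trend-analysis skill listed in the requirements, a linear trend over a monthly metric can be fit with ordinary least squares using only the standard library. The sample figures are invented for illustration.

```python
def linear_trend(values):
    """Fit y = slope*x + intercept by ordinary least squares over x = 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical monthly order counts showing a steady upward trend.
monthly_orders = [100, 110, 120, 130, 140, 150]
slope, intercept = linear_trend(monthly_orders)
projection = slope * 6 + intercept  # naive projection for the next month
```

In day-to-day work the same calculation would usually come from SQL window functions, Excel's TREND, or a BI tool's built-in trend line; the point is the underlying method, not the tooling.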
Sr. Data Engineer (Hybrid)
Chicago, IL
The American Medical Association (AMA) is the nation's largest professional Association of physicians and a non-profit organization. We are a unifying voice and powerful ally for America's physicians, the patients they care for, and the promise of a healthier nation. To be part of the AMA is to be part of our Mission to promote the art and science of medicine and the betterment of public health.
At AMA, our mission to improve the health of the nation starts with our people. We foster an inclusive, people-first culture where every employee is empowered to perform at their best. Together, we advance meaningful change in health care and the communities we serve.
We encourage and support professional development for our employees, and we are dedicated to social responsibility. We invite you to learn more about us and we look forward to getting to know you.
We have an opportunity at our corporate offices in Chicago for a Sr. Data Engineer (Hybrid) on our Information Technology team. This is a hybrid position reporting into our Chicago, IL office, requiring 3 days a week in the office.
As a Sr. Data Engineer, you will play a key role in implementing and maintaining AMA's enterprise data platform to support analytics, interoperability, and responsible AI adoption. This role partners closely with platform engineering, data governance, data science, IT security, and business stakeholders to deliver high-quality, reliable, and secure data products. This role contributes to AMA's modern lakehouse architecture, optimizing data operations and embedding governance and quality standards into engineering workflows. This role serves as a senior technical contributor within the team, providing mentorship to junior engineers and implementing engineering best practices within the data platform function, in alignment with architectural direction set by leadership.
RESPONSIBILITIES:
Data Engineering & AI Enablement
- Build and maintain scalable data pipelines and ETL/ELT workflows supporting analytics, operational reporting, and AI/ML use cases.
- Implement best practice patterns for ingestion, transformation, modeling, and orchestration within a modern lakehouse environment (e.g., Databricks, Delta Lake, Azure Data Lake).
- Develop high-performance data models and curated datasets with strong attention to quality, usability, and interoperability; create reusable engineering components and automation.
- Collaborate with the Architecture Team, the Data Platform Lead, and federated IT teams to optimize storage, compute, and architectural patterns for performance and cost-efficiency.
- Build model-ready data sets and feature pipelines to support AI/ML use cases; serve as a technical coordination point supporting business units' AI-related infrastructure needs.
- Collaborate with data scientists and the AI Working Group to operationalize models responsibly and maintain ongoing monitoring signals.
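The ingestion and transformation duties above can be illustrated with a minimal, framework-free transform step of the kind that sits between a raw ("bronze") and a cleaned ("silver") layer. Field names and cleaning rules here are invented for illustration; in a Databricks lakehouse the same logic would typically run as a Spark or Delta Live Tables job rather than plain Python.

```python
def to_silver(raw_rows):
    """Clean raw records: drop unkeyed rows, normalize types, dedupe on id."""
    seen = set()
    silver = []
    for row in raw_rows:
        rid = row.get("id")
        if rid is None or rid in seen:
            continue  # reject rows missing the key and duplicate keys
        seen.add(rid)
        silver.append({
            "id": rid,
            "email": (row.get("email") or "").strip().lower(),  # normalize casing/whitespace
            "amount": round(float(row.get("amount", 0)), 2),    # string -> rounded numeric
        })
    return silver

# Hypothetical raw feed: one good row, one duplicate id, one row missing its key.
raw = [
    {"id": 1, "email": " A@Example.com ", "amount": "19.999"},
    {"id": 1, "email": "dup@example.com", "amount": "5"},
    {"email": "nokey@example.com", "amount": "1"},
]
clean = to_silver(raw)
```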
Governance, Quality & Compliance
- Embed data governance, metadata standards, lineage tracking, and quality controls directly into engineering workflows, ensuring sound technical implementation and alignment.
- Work with the Data Governance Lead and business stakeholders to operationalize stewardship, classification, validation, retention, and access standards.
- Implement privacy-by-design and security-by-design principles, ensuring compliance with internal policies and regulatory obligations.
- Maintain documentation for pipelines, datasets, and transformations to support transparency and audit requirements.
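A lightweight way to embed the quality controls described above directly in a pipeline is a table of named validation rules evaluated per record, with failures logged as governance signals. The rules themselves are illustrative assumptions, not AMA's actual standards.

```python
# Named, data-quality rules; each maps a rule name to a per-record predicate.
RULES = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_nonnegative": lambda r: float(r.get("amount", 0)) >= 0,
    "email_has_at": lambda r: "@" in r.get("email", ""),
}

def validate(record):
    """Return the names of the rules the record fails (empty list means pass)."""
    return [name for name, check in RULES.items() if not check(record)]

failures = validate({"id": 7, "amount": "-3", "email": "bad-address"})
```

Keeping the rules in a named table rather than inline `if` statements makes it easy to report which specific standard a record violated, which is what stewardship workflows generally need.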
Platform Reliability, Observability & Optimization
- Monitor and troubleshoot pipeline failures, performance bottlenecks, data anomalies, and platform-level issues.
- Implement observability tooling, alerts, logging, and dashboards to ensure end-to-end reliability.
- Support cost governance by optimizing compute resources, refining job schedules, and advising on efficient architecture.
- Collaborate with the Data Platform Lead on scaling, configuration management, CI/CD pipelines, and environment management.
- Collaborate with business units to understand data needs, translate them into engineering requirements, and deliver fit-for-purpose data solutions; share and apply best practices and emerging technologies within assigned initiatives.
- Work with IT Security and Legal/Compliance to ensure platform and datasets meet risk and regulatory standards.
Staff Management
- Lead, mentor, and provide management oversight for staff.
- Set objectives, evaluate employee performance, and foster a collaborative team environment.
- Develop staff knowledge and skills to support career development.
May include other responsibilities as assigned
REQUIREMENTS:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field preferred; equivalent work experience with an HS diploma/equivalent education required.
- 5+ years of experience in data engineering within cloud environments.
- Experience in people management preferred.
- Demonstrated hands-on experience with modern data platforms (Databricks preferred).
- Proficiency in Python, SQL, and data transformation frameworks.
- Experience designing and operationalizing ETL/ELT pipelines, orchestration workflows (Airflow, Databricks Workflows), and CI/CD processes.
- Solid understanding of data modeling, structured/unstructured data patterns, and schema design.
- Experience implementing governance and quality controls: metadata, lineage, validation, stewardship workflows.
- Working knowledge of cloud architecture, IAM, networking, and security best practices.
- Demonstrated ability to collaborate across technical and business teams.
- Exposure to AI/ML engineering concepts, feature stores, model monitoring, or MLOps patterns.
- Experience with infrastructure-as-code (Terraform, CloudFormation) or DevOps tooling.
The American Medical Association is located at 330 N. Wabash Avenue, Chicago, IL 60611 and is convenient to all public transportation in Chicago.
This role is an exempt position, and the salary range for this position is $115,523.42-$150,972.44. This is the lowest to highest salary we believe we would pay for this role at the time of this posting. An employee's pay within the salary range will be determined by a variety of factors including but not limited to business consideration and geographical location, as well as candidate qualifications, such as skills, education, and experience. Employees are also eligible to participate in an incentive plan. To learn more about the American Medical Association's benefits offerings, please click here.
We are an equal opportunity employer, committed to diversity in our workforce. All qualified applicants will receive consideration for employment. As an EOE/AA employer, the American Medical Association will not discriminate in its employment practices due to an applicant's race, color, religion, sex, age, national origin, sexual orientation, gender identity and veteran or disability status.
THE AMA IS COMMITTED TO IMPROVING THE HEALTH OF THE NATION
Remote working/work at home options are available for this role.