Data Migration vs. Database Migration Jobs in USA

9,270 positions found — Page 10

Staff Software Engineer, Conversion Data Privacy
Salary not disclosed
San Francisco, CA 2 days ago

About Pinterest:


Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.


Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.


At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.


Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.

Team & Mission


The Privacy & Conversion Data team is responsible for how the company safely and compliantly uses conversion data to power monetization. We build and operate the core privacy infrastructure behind ads reporting and optimization, including controlled data environments, fine-grained access controls, centralized privacy rules enforcement, and de-identification pipelines for conversion data. Our mission is to make conversion data privacy-preserving by default: centralized, de-identified, auditable, and easy for teams to use, while maintaining high utility for advertisers and staying ahead of an evolving global regulatory landscape.



Role Summary


We're seeking a Staff Engineer to lead the architecture and technical direction for the conversion data privacy platform, spanning both core Conversion Data systems and de-identification for ads reporting. You'll own the end-to-end design and evolution of privacy-critical pipelines and services, partner closely with Product, Data Science, Legal, and infrastructure teams, and set the technical bar for how we use conversion data safely at scale.



What you'll do:



  • Lead the technical strategy and architecture for conversion data privacy across access controls, de-identification, deletion, and privacy rules enforcement, driving toward a centralized, de-identified-by-default, automated privacy platform for monetization.
  • Design and evolve core privacy infrastructure including controlled environments for sensitive data, fine-grained authorization and policy enforcement, and a central policy repository that consistently governs access across major data platforms and query engines.
  • Own de-identification pipelines for ads reporting end-to-end: from separating sensitive and non-sensitive data, applying de-identification techniques and transformations, and generating privacy-preserving datasets, to validating data utility and feeding reporting and analytics surfaces.
  • Build and improve privacy frameworks and tooling (for both online and offline workflows) that make safe, compliant conversion data usage simple and self-service for downstream teams, reducing onboarding friction for new datasets, restrictions, and use cases.
  • Drive operational excellence and compliance by defining SLAs, building robust monitoring and alerting (e.g., de-identification quality, opt-out metrics, data leakages), leading incident response, and developing performant deletion and leakage-handling workflows that meet regulatory and audit requirements.
  • Partner cross-functionally with ads, data, product, legal, and infrastructure stakeholders to translate legal/privacy requirements into technical designs, make clear tradeoffs between privacy and utility, and drive alignment on roadmaps, launches, and policy changes that impact advertisers and users.
  • Mentor and uplevel engineers across multiple teams, lead critical design and code reviews in privacy-sensitive areas, and establish best practices and documentation for privacy-by-design, de-identification, and large-scale data systems.
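As an illustration of the de-identification pipeline described in the bullets above (separating sensitive from non-sensitive attributes and emitting a privacy-preserving record), here is a minimal, hedged sketch. The field names and the salted-hash pseudonymization are hypothetical, not Pinterest's actual schema or method:

```python
import hashlib

# Hypothetical sensitive attributes of a conversion event; illustrative only.
SENSITIVE_FIELDS = {"email", "ip_address"}

def pseudonymize(value: str, salt: str = "rotate-me") -> str:
    """One-way salted hash: joins on the token still work, but the raw
    identifier never leaves the controlled environment."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

def deidentify_event(event: dict) -> dict:
    """Split sensitive from non-sensitive attributes, pseudonymize the
    sensitive ones, and return a privacy-preserving record."""
    safe = {k: v for k, v in event.items() if k not in SENSITIVE_FIELDS}
    safe.update({k: pseudonymize(event[k]) for k in SENSITIVE_FIELDS if k in event})
    return safe

event = {"email": "user@example.com", "ip_address": "203.0.113.7", "purchase_value": 42.0}
clean = deidentify_event(event)
assert "user@example.com" not in clean.values()  # raw identifier is gone
```

In a real pipeline this step would run inside the controlled environment, with the salt managed as a rotated secret and the output validated for utility before reaching reporting surfaces.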


What we're looking for:



  • BS+ in Computer Science (or related field) or equivalent practical experience.
  • 8+ years of professional software engineering experience, with a focus on large-scale data systems or distributed systems.
  • Strong proficiency building and operating data pipelines and services using Java/Scala/Kotlin or Python, plus SQL; experience with modern big data ecosystems is a plus.
  • Experience designing secure, reliable systems and APIs, with solid grounding in data modeling, access control, and performance optimization.
  • Meaningful experience in at least one of: privacy-preserving data systems (e.g., de-identification, k-anonymity), ads measurement/attribution, or large-scale analytics/experimentation platforms.
  • Proven ability to drive cross-team technical initiatives from design through rollout, working closely with product, data science, and non-engineering partners (e.g., Legal, Compliance).
  • Strong communication and leadership skills, with a track record of mentoring engineers, raising engineering standards, and making sound decisions in ambiguous, high-impact problem spaces.
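The k-anonymity concept named in the requirements can be illustrated with a minimal check: a dataset is k-anonymous when every combination of quasi-identifier values occurs at least k times, so no individual stands out within their group. The field names below are hypothetical:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k=5):
    """True if every combination of quasi-identifier values appears
    at least k times across the dataset."""
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(count >= k for count in groups.values())

rows = ([{"zip": "941", "age_band": "30-39"}] * 5
        + [{"zip": "100", "age_band": "20-29"}] * 2)
assert is_k_anonymous(rows, ["zip", "age_band"], k=2) is True
assert is_k_anonymous(rows, ["zip", "age_band"], k=5) is False  # second group has only 2 rows
```

Production systems typically go further (generalizing or suppressing values until the property holds), but the check itself is this simple grouping test.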


In-Office Requirement Statement:



  • We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.


Relocation Statement:



  • This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.


#LI-REMOTE


#LI-KK6

At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, and particular skills and expertise.


Information regarding the culture at Pinterest and benefits available for this position can be found here.

US-based applicants only: $177,185—$364,795 USD

Our Commitment to Inclusion:


Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.

Not Specified
Data Governance Program Manager
✦ New
Salary not disclosed
Lincoln, NE 10 hours ago

Summary



GENERAL SUMMARY:



Responsible for the establishment and maintenance of Bryan Health's data governance program and infrastructure. As the organization's data governance champion, collaborates with leaders and data stewards across Bryan Health to develop, implement, and execute the organization's data and AI governance strategies, policies, and procedures.



As a critical part of Bryan's Data Analytics Center of Excellence, recommends data governance policies to entity governance for review and approval, while working with IT and key business units to constantly improve data definitions, integrity, security, and reliability, ensuring high-quality inputs for analytics and AI models.



PRINCIPAL JOB FUNCTIONS:



1. *Commits to the mission, vision, beliefs and consistently demonstrates our core values.



2. *Participates in or supports work stream planning process.



3. *Effectively communicates with executive sponsors, project advocates, leaders, and data stewards to help them understand and apply policy and principles of data governance while ensuring that deliverables meet business requirements.



4. *Develops and plays a hands-on role in operationalizing an organization-wide data governance strategy, framework, and roadmap that aligns with the organization's overall data, analytics, AI, and data security strategy.



5. *Defines and enforces data standards, data classification, and data protection guidelines to ensure consistent and reliable data across systems, processes, and business units.



6. Translates Data and AI governance policies and standards into actionable strategies and implementable solutions, ensuring practical application and demonstrable impact on data reliability.



7. *Develops and implements policies and standards for managing data used in AI models to ensure quality, security, privacy, and compliance with relevant regulations.



8. *Collaborates with business stakeholders, data owners and data stewards to establish data governance roles, responsibilities, and accountabilities within the organization.



9. *Collaborates with data science and AI/ML teams to ensure the reliability of data used in AI training, validation, and deployment.



10. Aligns data governance policies with AI development lifecycle, ensuring proper data stewardship and governance throughout the AI project lifecycle.



11. *Conducts assessments to identify data issues, gaps, and opportunities for improvement.



12. *Provides guidance and training to business users and data stewards on data governance policies, procedures, and best practices.



13. *Collaborates with Data Council, Advisory Teams, and other governance groups for program oversight and issue resolution.



14. Identifies and mitigates risks related to AI data, including bias, fairness, and privacy concerns, to support responsible AI practices.



15. Works closely with Data Analytics and IT to design and implement data governance tools, technologies, and platforms to provide data quality checks, data cataloging, and data lineage tracking.



16. Stays up to date with AI and data governance industry trends, emerging technologies, regulatory changes, and standards around the evolving legal, ethical, and technological standards related to AI and data governance, and proactively recommends improvements and enhancements to governance frameworks.



17. Ensures that data used in AI systems complies with data privacy laws and organizational policies and maintains auditability of AI data pipelines.



18. Engages and advises the Bryan Data Analytics Council on project prioritization and other agenda items as needed.



19. Develops effective collaborative relationships with stakeholders across the Bryan Health System.



20. Works with executive sponsors and project advocates to ensure products meet business requirements.



21. Collaborates with other teams and leaders to ensure resources and priorities align with Data Council guidance.



22. Establishes effective relationships with clients and provides leadership for all data governance at Bryan.



23. Maintains professional growth and development through seminars, workshops, and professional affiliations to keep abreast of the latest trends and industry news in the field of expertise.



24. Effectively facilitates and participates on multi-disciplinary teams; attends and participates in project meetings and activities.



25. Performs other related projects and duties as assigned.



(Essential Job functions are marked with an asterisk *).



REQUIRED KNOWLEDGE, SKILLS AND ABILITIES:



1. Expert knowledge of the principles of data governance and data governance program design.



2. Highly proficient in data governance concepts and application (metadata management, data quality, stewardship, etc.).



3. Knowledge of AI and AI governance.



4. Knowledge of health care market and industry trends.



5. Knowledge of computer hardware equipment and software applications relevant to work functions.



6. Strong skills in problem solving and process improvement.



7. Excellent communication skills and ability to explain complex topics to non-technical audiences.



8. Strong ability in program and project management.



9. Ability to conduct crucial conversations that achieve desired outcomes.



10. Ability to communicate effectively both verbally and in writing.



11. Ability to establish and maintain effective working relationships with all levels of personnel and medical staff.



12. Ability to effectively interact with clients that have a broad range of computer knowledge and ability.



13. Ability to plan for and act on changes in the business and market environment that impact current business plans and processes.



14. Ability to problem solve and engage independent critical thinking skills.



15. Ability to prioritize work demands and work with minimal supervision.



16. Ability to maintain confidentiality relevant to sensitive information.



17. Ability to maintain regular and punctual attendance.



EDUCATION AND EXPERIENCE:



Bachelor's degree in Data Science, Data Management, Analytics, Computer Science, Public Health, Hospital Administration, Business, or a related field required. Master's degree preferred. Five (5) years of related work experience required. Prior project leadership experience (formal or informal) required. Prior data governance experience highly preferred. Prior Epic experience preferred.



OTHER CREDENTIALS / CERTIFICATIONS:



Epic Cogito Fundamentals and Cogito Project Manager certification preferred. Epic certification in Cogito Fundamentals and Cogito Project Manager required within six (6) months of hire.



PHYSICAL REQUIREMENTS:



(Physical Requirements are based on federal criteria and assigned by Human Resources upon review of the Principal Job Functions.)



(DOT) Characterized as sedentary work requiring exertion of up to 10 pounds of force occasionally and/or a negligible amount of force frequently to lift, carry, push, pull, or otherwise move objects, including the human body.

Not Specified
Databricks Architect/ Senior Data Engineer
🏢 OZ
Salary not disclosed
Boca Raton, FL 2 days ago

OZ – Databricks Architect/ Senior Data Engineer


Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.


We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!


What We're Looking For:

We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.


This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.


Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.


Position Overview:

The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.


This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.


Key Responsibilities:

  • Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
  • Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
  • DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
  • Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
  • Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
  • GenAI Applications Development: Experience developing GenAI applications is a strong plus.
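The posting's Medallion Architecture (Bronze → Silver → Gold) and pipeline responsibilities can be sketched framework-free. In practice this would be PySpark over Delta tables; the plain-Python version below only illustrates the layering logic, with hypothetical field names:

```python
# Bronze: raw events as ingested (schema-on-read, nothing dropped).
bronze = [
    {"order_id": "1", "amount": "19.99", "region": "east"},
    {"order_id": "2", "amount": "bad", "region": "east"},   # malformed row
    {"order_id": "3", "amount": "5.00", "region": "west"},
]

def to_silver(rows):
    """Silver: validated, typed records; malformed rows are quarantined
    for review rather than silently dropped."""
    clean, quarantine = [], []
    for row in rows:
        try:
            clean.append({**row, "amount": float(row["amount"])})
        except ValueError:
            quarantine.append(row)
    return clean, quarantine

def to_gold(rows):
    """Gold: business-level aggregate (revenue per region) ready for BI tools."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

silver, quarantined = to_silver(bronze)
gold = to_gold(silver)  # {'east': 19.99, 'west': 5.0}
```

The design point is that each layer has a contract: Bronze preserves everything, Silver enforces types and quality, and Gold serves curated aggregates, which is what makes the pipelines testable and tunable independently.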


Requirements:

  • 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
  • Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
  • Strong programming skills in Python and SQL; experience with PySpark required.
  • Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
  • Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
  • Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
  • Strong understanding of data architecture, data modeling, and performance optimization.
  • Experience working with cross-functional teams to deliver enterprise data solutions.
  • Tackles complex data challenges, ensuring data quality and reliable delivery.


Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience designing enterprise-scale data platforms and modern data architectures.
  • Experience with data integration tools such as Azure Data Factory or similar platforms.
  • Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
  • Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
  • Databricks, Azure, or cloud certifications are preferred.
  • Strong problem-solving, communication, and technical leadership skills.


Technical Proficiency in:

  • Databricks, Apache Spark, PySpark, Delta Lake
  • Python, SQL, Scala (preferred)
  • Cloud platforms: Azure (preferred), AWS, or GCP
  • Azure Data Factory, Kafka, and modern data integration tools
  • Data warehousing: Databricks, Snowflake, or Microsoft Fabric
  • DevOps tools: Git, Azure DevOps, CI/CD pipelines
  • Data architecture, ETL/ELT design, and performance optimization


What You’re Looking For:

Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.


About Us:

OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.


OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.

Not Specified
Data Analyst Manager
✦ New
Salary not disclosed
Hickory, NC 1 day ago

Who We Are

At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.

What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.

Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?


Role Summary:

The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.


Responsibilities:

Data Architecture & Tooling

  • Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
  • Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
  • Ensure alignment between data infrastructure and evolving business needs across distribution operations
  • Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite

Data Governance & Quality

  • Define and enforce data standards, metric definitions, and naming conventions across all business domains
  • Establish data ownership, lineage documentation, and access governance policies
  • Implement monitoring and alerting for data quality issues across source systems and the warehouse
  • Build and maintain a data dictionary that serves as the single source of truth for the organization

Orchestration of Analysts & Systems

  • Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
  • Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
  • Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
  • Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business


Qualifications:

Required

  • 3-5 years of experience in data analytics, business intelligence, or data engineering
  • 2+ years in a lead or management role overseeing analysts or data team members
  • Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
  • Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
  • Proven track record establishing data governance standards and documentation practices
  • Experience integrating and managing multiple data sources across SaaS and ERP platforms
  • Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders

Preferred

  • Experience in distribution, wholesale, or supply chain environments
  • Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
  • Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
  • Background working within IT-led or hybrid IT/Analytics teams


Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Life & Disability insurance
  • 401(K) with company match


Company Paid holidays and PTO:

  • Feetures offers 20 PTO Days which are available to you on day one of employment and are available to all employees, no matter your role. After working at Feetures for 5 years, your PTO days will increase to 25 days. Days can be used for vacations, appointments and sick days.
  • We offer 10 company paid holidays and 1 floating holiday per year.


Perks:

  • Parking provided (Charlotte office and onsite at Hickory office)
  • Employee Engagement team
  • Monthly stipend to pursue an active lifestyle


Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.

Not Specified
Data Quality Analyst
✦ New
Salary not disclosed
Juno Beach, FL 4 hours ago

Job Description


The Data Quality Analyst / Databricks Implementation Specialist plays a key role in advancing the company’s enterprise data governance and Databricks Lakehouse strategy. This role partners closely with business data stewards, data owners, and technical teams to translate business data requirements into governed, high-quality datasets within Databricks Unity Catalog. The analyst will support domain onboarding, develop and operationalize data quality rules, perform profiling and analysis, and help implement enterprise standards for metadata, lineage, and semantic consistency.


Key Responsibilities


Data Quality & Profiling

  • Develop, document, and maintain data quality rules for critical data elements (CDEs).
  • Perform data profiling, anomaly detection, and root-cause analysis.
  • Partner with data stewards to validate definitions, thresholds, and business rules.
  • Monitor and report on data quality metrics and remediation progress.

Databricks Unity Catalog Implementation

  • Support Unity Catalog rollout across domains, including catalog structure, tagging, and metadata standards.
  • Assist with onboarding domains into the Bronze → Silver → Gold architecture.
  • Ensure lineage, ownership, and quality rules are embedded into Databricks pipelines.
  • Help implement domain-aligned access controls and sensitivity tagging.

Collaboration with Data Stewards & Business Partners

  • Work directly with business data stewards to understand data requirements and quality expectations.
  • Translate business meaning into standardized CDEs and steward-approved metadata.
  • Facilitate working sessions to align on semantics, domain boundaries, and data product requirements.
  • Support consistent governance practices across domains.

Metadata, Lineage, and Catalog Management

  • Maintain high-quality metadata in the enterprise data catalog.
  • Ensure CDEs, KPIs, and domain terms are accurately documented.
  • Validate lineage from raw sources through refined layers.

Data Analysis & Issue Resolution

  • Investigate data issues raised by business users or downstream consumers.
  • Perform impact analysis for schema changes or quality rule updates.
  • Support remediation efforts with engineering and business teams.
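A data quality rule for a CDE, as described above, typically reduces to a metric plus a threshold that can be monitored and reported. A minimal sketch follows; the rule name, field, and 95% threshold are hypothetical:

```python
def completeness(rows, field):
    """Fraction of records where the field is present and non-empty."""
    if not rows:
        return 1.0
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

# Hypothetical rule: the 'customer_id' CDE must be at least 95% complete.
RULES = [("customer_id_completeness", "customer_id", 0.95)]

def evaluate(rows):
    """Return {rule_name: (measured score, pass/fail)} for each rule."""
    report = {}
    for name, field, threshold in RULES:
        score = completeness(rows, field)
        report[name] = (score, score >= threshold)
    return report

rows = [{"customer_id": "a"}, {"customer_id": ""},
        {"customer_id": "b"}, {"customer_id": "c"}]
report = evaluate(rows)  # 3/4 = 0.75 complete, so the rule fails
```

In a Databricks setting the same rule would usually live as a SQL expression or expectation attached to the pipeline, with failures feeding the remediation queue instead of an in-memory report.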


Required Skills & Experience


3–5 years of experience in data quality, data governance, or data analysis.

Hands-on experience with Databricks, Delta Lake, or similar cloud platforms.

Strong understanding of data quality concepts.

Experience with metadata catalogs or governance tools.

Proficiency with SQL and data analysis.

Strong communication skills.


Nice to Have Skills & Experience


Experience with Databricks Unity Catalog.

Familiarity with Medallion Architecture.

Exposure to governance frameworks (DAMA, DCAM).

Experience collaborating with data stewards or data owners.

Knowledge of data modeling or semantic layers.


Pay rate ranges from $35 to $43/hr, depending on background and experience.

Not Specified
MDM Data Quality & Cleansing Specialist
✦ New
Salary not disclosed
Wayne, PA 4 hours ago

Job Name: MDM Data Quality & Cleansing Specialist

Job Location: Wayne, PA, 19087 (2 days/week onsite is required - Team onsite day is Thursdays)

Duration: 6 Months with potential to extend

Working Hours: 8:30 am - 5:30 pm (some flexibility)

Interview Process: one 45-minute virtual interview


Position Summary


The MDM Data Quality & Cleansing Specialist is responsible for supporting enterprise Master Data Management (MDM) initiatives by performing remediation of post-match-merge fallout records and executing data cleansing activities across designated data domains. This position plays a critical role in ensuring the accuracy, consistency, and completeness of master data in accordance with established data governance policies, data quality standards, and operational procedures.


Responsibilities


MDM Fallout Management

  • Review and research fallout records generated from MDM match merge processes.
  • Perform timely and accurate remediation of data exceptions in accordance with predefined business rules and governance standards.
  • Validate survivorship outcomes and ensure that entity resolution results align with data stewardship expectations.
  • Conduct root cause analysis to determine factors contributing to recurring data exceptions.

Data Cleansing and Data Quality Support

  • Execute data cleansing tasks including standardization, deduplication, formatting corrections, and attribute validation.
  • Verify data completeness and accuracy using approved tools, templates, and quality checks.
  • Perform bulk updates or corrections as authorized, following established protocols and change control requirements.
  • Assist in monitoring data quality dashboards, reports, and exception queues.

Data Stewardship Collaboration

  • Collaborate with Data Governance, Data Stewards, business partners, and MDM Operations teams to resolve data issues requiring business input.
  • Document remediation decisions and maintain required audit trails in accordance with compliance and governance standards.
  • Support stewardship processes by escalating complex or policy-related issues as appropriate.
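The standardization-and-deduplication work described above can be sketched in miniature. The exact-key matching here is deliberately naive (real MDM tools such as Informatica use weighted fuzzy matching and survivorship rules), and the record fields are hypothetical:

```python
def standardize(record):
    """Normalize the attributes used for matching (case, whitespace)."""
    return {
        "name": " ".join(record["name"].lower().split()),
        "email": record["email"].strip().lower(),
    }

def find_merge_candidates(records):
    """Group records by a simple standardized match key. Groups with more
    than one record are merge candidates; in a real MDM flow, records that
    match ambiguously would land in the fallout queue for steward review."""
    groups = {}
    for rec in records:
        std = standardize(rec)
        key = (std["name"], std["email"])
        groups.setdefault(key, []).append(rec)
    return {k: v for k, v in groups.items() if len(v) > 1}

records = [
    {"name": "Ada  Lovelace", "email": "ADA@example.com"},
    {"name": "ada lovelace", "email": "ada@example.com "},
    {"name": "Grace Hopper", "email": "grace@example.com"},
]
dupes = find_merge_candidates(records)  # one duplicate group (the two Ada records)
```

The point of standardizing before matching is that "Ada  Lovelace / ADA@example.com" and "ada lovelace / ada@example.com " are the same party; without normalization they would fall out of the match-merge process as exceptions.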


Qualifications

Required


  • Minimum of 2 years of experience in Master Data Management, Data Governance, Data Quality, or a related data operations role.
  • Proficiency with Microsoft Excel (e.g., lookup functions, pivot tables, filtering, data cleaning techniques).
  • Experience working with one or more MDM applications (e.g., Informatica or similar).


Preferred


  • Experience with match merge or entity resolution workflows.
  • Basic proficiency in SQL or other data manipulation/query tools.
  • Familiarity with data governance frameworks, data quality rules, and metadata management principles.
  • Prior experience working with party (customer, partner) master data.
Not Specified
Data Steward Senior Analyst (Record Retention & Deletion Policy and Processes)
Salary not disclosed
Phoenix, AZ 3 days ago

As a Data Steward Senior Analyst, you are part of a team responsible for enabling and supporting compliance with data-related enterprise policies within your domains/business units. You and your team are responsible for identifying critical data and associated risks, maintaining data definitions, classifying data, supporting data sourcing/usage requests, measuring Data Risk Controls, and confirming Data Issues are remediated. You have the opportunity to partner across various business units, technology teams, and product/platform teams to define and implement the data governance strategy, oversee data quality, resolve data/platform issues, and drive consistency, usability, and governance of specific product data across the enterprise.


In addition, this role will play a key part in effectively communicating new and updated data-related policies to the teams responsible for compliance. The individual must be skilled in preparing clear, engaging presentations that translate formal policy language into practical, easy-to-understand guidance and “tell the story” behind the policy requirements. The role will also support the delivery of training sessions, facilitate policy office hours, and serve as a go-to resource for questions related to data governance and retention compliance.


Your Primary Responsibilities may include:

• Assist in identifying data-related risks and associated controls for key business processes. Risks relate to Record Retention (primary), Data Quality, Data Movement, Data Stewardship, Data Protection, Data Sharing, among others.

• Develop training materials and educate organization on Record Retention and Deletion processes and procedures.

• Develop deep understanding of key enterprise data-related policies and serve as the policy expert for the business unit, providing education to teams regarding policy implications for business.

• Collaborate with and influence product managers to ensure all new use cases are managed according to policies.

• Influence and contribute to strategic improvements to data assessment processes and analytical tools.

• Support current regulatory reporting needs via existing platforms, working with upstream data providers, downstream business partners, as well as technology teams.

• Provide subject matter expertise on multiple platforms.

• Partner with the Data Steward Manager to develop and manage the data compliance roadmap.


Qualifications include:

• 5+ years of experience in a similar role ensuring compliance with Record Retention and Deletion policies.

• Strong communication skills and ability to influence and engage at multiple levels and cross functionally.

• Intermediate understanding of Data Management and Data Governance concepts (metadata, lineage, data quality, etc.), with prior hands-on experience.

• 5+ years of Data Quality Management experience.

• Strong familiarity with data architecture and/or data modeling concepts.

• 5+ years of experience with Agile or SAFe project methodologies

• Bachelor’s degree in Finance, Engineering, Mathematics, Statistics, Computer Science or other similar fields.

• Preferred: Experience in Travel Industry.

• Preferred: Knowledge of RCSA (Risk Control Self-Assessment) methodology


Leadership Skills may include:

• Makes Decisions Quickly and Effectively: Drives effective outcomes through decision-making authority. Displays judgment and discretion to ensure deliverables meet American Express policy and overall compliance requirements.

• Drives Innovation & Change: Provides systematic and rational analysis to identify the root cause of problems. Is prepared to challenge the status quo and drive innovation. Makes informed judgments, recommends tailored solutions.

• Leverages Team - Collaboration: Coordinates efforts within and across teams to deliver goals; accountable for bringing in ideas, information, suggestions, and expertise from others inside and outside the immediate team.

• Communication: Influences and holds others accountable, and is able to convince others. Identifies specific data governance requirements and communicates them clearly and compellingly.

Data Entry 1
Salary not disclosed
Atlanta 2 days ago
Immediate need for a talented Data Entry 1.

This is a 3-month contract opportunity with long-term potential and is located in the U.S. (Remote).

Please review the job description below and contact me ASAP if you are interested.

Job ID: 26-08963. Pay Range: $22 - $23/hour.

Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).

Key Responsibilities: Submit a minimum of 8 submissions per 8-hour shift.

This may change as we implement process changes.

Review requests, research them, and submit changes per regulation/business rules.

The main function of a data entry specialist is to operate data entry devices, such as a keyboard or computer, to verify and input data.

A typical data entry specialist is responsible for accurate information documentation and personal project management.

Read source documents such as practitioner profiles and emails, and enter data into specific data fields or onto tapes or disks for subsequent entry, using keyboards or scanners.

Compile, sort and verify the accuracy of data before it is entered.

Locate and correct data entry errors or report them to supervisors.

Compare data with source documents, or re-enter data in verification format to detect errors.

Maintain logs of activities and completed work.
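One of the responsibilities above is re-entering data "in verification format" to detect errors, i.e., double-entry verification: two independent entries of the same source document are compared field by field, and mismatches are flagged for correction. A minimal sketch of that comparison (the field names and values are hypothetical examples, not from the posting):

```python
def verify_double_entry(first_pass, second_pass):
    """Compare two independent entries of the same source document and
    return field-level mismatches as {field: (first_value, second_value)}."""
    mismatches = {}
    for field in first_pass.keys() | second_pass.keys():
        a, b = first_pass.get(field), second_pass.get(field)
        if a != b:
            mismatches[field] = (a, b)
    return mismatches

# Two hypothetical passes over the same practitioner profile.
entry1 = {"npi": "1234567890", "name": "Dr. Lee", "state": "GA"}
entry2 = {"npi": "1234567809", "name": "Dr. Lee", "state": "GA"}
errors = verify_double_entry(entry1, entry2)
# errors == {"npi": ("1234567890", "1234567809")}
```

Any field in `errors` would then be checked against the source document, matching the "locate and correct data entry errors or report them" duty above.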

Key Requirements and Technology Experience: Key Skills: documentation and time management.

Health plan experience, data entry experience, and previous experience with computer applications such as Microsoft Word and Excel.

3-5 years of data entry experience is required.

A High School Diploma or GED is required.

Our client is a leader in the Healthcare Industry, and we are currently interviewing to fill this and other similar contract positions.

If you are interested in this position, please apply online for immediate consideration.

Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates and contracted partners.

Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Data Product Engineer
Salary not disclosed
Newark, NJ 3 days ago
Job Title: Marketplace Data Product Engineer

Duration: 6+ months

Location: 100% Remote

Job Overview

The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains - leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. This role will guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.

Key Responsibilities


  • Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
  • Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
  • Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
  • Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
  • Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
  • Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
  • Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
  • Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
  • Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
  • Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.


Required Skills & Qualifications


  • 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
  • Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations, and metadata-driven architectures.
  • Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
  • Excellent communication and presentation skills, with experience translating technical concepts for business partners.
  • Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
  • Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
  • Strong collaboration and stakeholder management skills across engineering, product, and business teams.
  • Comfortable working in fast-moving environments and driving clarity through ambiguity.


Preferred Qualifications


  • Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
  • Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
  • Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
  • Background in solution architecture, customer engineering, or sales engineering.
  • Experience developing demo environments, sample apps, or repeatable platform enablement assets.
  • Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.


Sr. Data Engineer, tvScientific
🏢 Pinterest
Salary not disclosed
San Francisco, CA 3 days ago

About Pinterest:


Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.


Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.


At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.


Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.

About tvScientific


tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.



As a Senior Data Engineer at tvScientific, you will be a key player in implementing the robust data infrastructure to power our data-heavy company. You will collaborate with our cross-functional teams to evolve our core data pipelines, design for efficiency as we scale, and store data in optimal engines and formats. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.



What you'll do:



  • Implement robust data infrastructure in AWS, using Spark with Scala
  • Evolve our core data pipelines to efficiently scale for our massive growth
  • Store data in optimal engines and formats
  • Collaborate with our cross-functional teams to design data solutions that meet business needs
  • Build out fault-tolerant batch and streaming pipelines
  • Leverage and optimize AWS resources while designing for scale
  • Collaborate closely with our Data Science and Product teams
  • How we'll define success:

    • Successful implementation of scalable and efficient data infrastructure
    • Timely delivery and optimization of data assets and APIs
    • High attention to detail in implementation of automated data quality checks
    • Effective collaboration with cross-functional teams
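One success measure above is "automated data quality checks." The role's actual pipelines run on Spark with Scala, but the pattern is language-agnostic: define named checks, count violating rows per batch, and alert or quarantine when counts exceed a threshold. A minimal Python sketch, where the field names (`impression_id`, `spend`, `country`) and rules are hypothetical examples:

```python
# Illustrative sketch only; a Spark job would express these as DataFrame
# filters, but the check structure is the same.
def run_quality_checks(rows):
    """Return a dict of check name -> number of violating rows in the batch."""
    checks = {
        "null_impression_id": lambda r: r.get("impression_id") in (None, ""),
        "negative_spend":     lambda r: (r.get("spend") or 0) < 0,
        "bad_country_code":   lambda r: len(r.get("country", "")) != 2,
    }
    return {name: sum(1 for r in rows if bad(r)) for name, bad in checks.items()}

batch = [
    {"impression_id": "i-1", "spend": 0.25, "country": "US"},   # clean row
    {"impression_id": "",    "spend": -1.0, "country": "USA"},  # fails all three
]
report = run_quality_checks(batch)
```

Wiring `report` into an alerting threshold (e.g., fail the pipeline when any count is nonzero) is what turns these checks into the automated gate the success criteria describe.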




What we're looking for:



  • Production data engineering experience
  • Proficiency in Spark and Scala, with proven experience building data infrastructure in Spark using Scala
  • Familiarity with data lakes, cloud warehouses, and storage formats
  • Strong proficiency in AWS services
  • Expertise in SQL for data manipulation and extraction
  • Excellent written and verbal communication skills
  • Bachelor's degree in Computer Science or a related field
  • Nice-to-Haves

    • Experience in adtech
    • Experience implementing data governance practices, including data quality, metadata management, and access controls
    • Strong understanding of privacy-by-design principles and handling of sensitive or regulated data
    • Familiarity with data table formats like Apache Iceberg, Delta




In-Office Requirement Statement:



  • We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.


Relocation Statement:



  • This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.


#LI-SM4


#LI-REMOTE

At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.


Information regarding the culture at Pinterest and benefits available for this position can be found here.

US based applicants only: $123,696—$254,667 USD

Our Commitment to Inclusion:


Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
