Explain Array In Data Structure Jobs in USA

38,929 positions found — Page 3

SAP S/4HANA Functional Process Data Expert
Salary not disclosed
Atlanta 3 days ago
Summary:

Location: Atlanta, GA
Duration: 12 Months
100% Remote – open to any area

Responsibilities:

Partner with global and regional business stakeholders to define data requirements aligned to standardized value stream processes.

Translate business process designs into clear master and transactional data definitions for S/4HANA.

Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.

Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).

Define data objects, attributes, and mandatory fields.

Support business rules, validations, and derivations.

Align data structures to SAP best practices and industry standards.

Support data cleansing, enrichment, and harmonization activities.

Define and validate data mapping rules from legacy systems to S/4HANA.

Participate in mock conversions, data loads, and reconciliation activities.

Ensure data quality thresholds are met prior to cutover.

Support the establishment and enforcement of global data standards and policies.

Work closely with Master Data and Data Governance teams.

Help define roles, ownership, and stewardship models for value stream data.

Contribute to data quality monitoring and remediation processes.

Support functional and integrated testing with a strong focus on data accuracy.

Validate business scenarios using migrated and created data.

Support cutover planning and execution from a data perspective.

Provide post-go-live support and stabilization.
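The mapping and validation responsibilities above (defining mandatory fields, legacy-to-S/4HANA mapping rules, pre-load quality checks) can be sketched in a few lines. This is a hypothetical illustration only: the field names (MATNR, MAKTX, MEINS) and rules are invented stand-ins, not an actual S/4HANA load program.

```python
# Hypothetical sketch: apply legacy-to-target field mapping rules and flag
# records that miss mandatory fields before a mock conversion load.

FIELD_MAP = {  # legacy column -> target column (illustrative names)
    "MATERIAL_NO": "MATNR",
    "DESCRIPTION": "MAKTX",
    "BASE_UOM": "MEINS",
}
MANDATORY = {"MATNR", "MEINS"}  # target fields that must be populated

def convert(record: dict) -> dict:
    """Rename legacy fields to target fields, dropping unmapped ones."""
    return {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}

def validate(record: dict) -> list:
    """Return the mandatory target fields that are missing or empty."""
    return sorted(f for f in MANDATORY if not record.get(f))

legacy_rows = [
    {"MATERIAL_NO": "100-200", "DESCRIPTION": "Bearing", "BASE_UOM": "EA"},
    {"MATERIAL_NO": "100-201", "DESCRIPTION": "Gasket", "BASE_UOM": ""},
]
converted = [convert(r) for r in legacy_rows]
errors = {r["MATNR"]: validate(r) for r in converted if validate(r)}
# errors maps each failing record to its missing mandatory fields --
# the kind of reconciliation output reviewed before cutover.
```

In a real conversion the same pattern runs inside the SAP Migration Cockpit or an ETL tool, with the rules maintained as governed metadata rather than code.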

Requirements: 5 years of SAP functional experience with a strong data focus.

Hands-on experience with SAP S/4HANA (greenfield preferred).

Proven involvement in large-scale, global ERP implementations.

Deep understanding of value stream business processes and related data objects.

Experience supporting data migration, cleansing, and validation.

Required Skills: Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).

Understanding of S/4HANA data model changes vs. ECC.

Experience working with SAP MDG or similar governance tools preferred.

Familiarity with data migration tools (e.g., SAP Migration Cockpit, LSMW, ETL tools).

Ability to read and interpret functional specs and data models.

Strong stakeholder management and communication skills.

Ability to work across global, cross-functional teams.

Detail-oriented with strong analytical and problem-solving skills.

Comfortable operating in a fast-paced transformation environment.

Preferred Skills: Experience in manufacturing, building materials, or asset-intensive industries.

Prior role as Functional Data Lead or Data Domain Lead.

Experience defining global templates and harmonized data models.

Knowledge of data quality tools and metrics.

Experience with MDG and setting up cost center and profit center groups.
Not Specified
Data Integration & AI Engineer
Salary not disclosed
Edison, NJ 1 day ago

About Wakefern

Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.


Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.


The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.


Essential Functions

  • Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
  • Implement and enforce data quality and governance standards to ensure the accuracy and consistency of data.
  • Provide input for project plans and timelines to align with business objectives.
  • Monitor project progress, identify risks, and implement mitigation strategies.
  • Work with cross-functional teams and ensure effective communication and collaboration.
  • Provide regular updates to the management team.
  • Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
  • Communicate and promote the code of ethics and business conduct.
  • Ensure completion of required company compliance training programs.
  • Be trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
  • Stay current through personal development and professional and industry organizations.

Responsibilities

  • Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
  • Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
  • Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
  • Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
  • Ensure data solutions and data sources meet quality, security, and compliance standards.
  • Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
  • Provide technical training, documentation, and ongoing support to end users of data automation systems.
  • Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.


Qualifications

  • A bachelor's degree or higher in computer science, information systems, or a related field.
  • Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
  • Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
  • Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
  • Experience with workflow orchestration tools such as Cloud Composer or Airflow
  • Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
  • Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
  • Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
  • Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
  • Experience establishing and enforcing data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
  • Ability to collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
  • Hands-on experience with IBM DataStage and Alteryx is a plus.
  • Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
  • Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
  • Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
  • Familiarity with data modeling tools.
  • Familiarity with DevOps practices for data (CI/CD pipelines)
  • Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
  • Strong knowledge and skills in data management, data quality, and data governance.
  • Strong communication, collaboration, and problem-solving skills.
  • Ability to work on multiple projects and prioritize tasks effectively.
  • Ability to work independently and in a team environment.
  • Ability to learn new technologies and tools quickly.
  • The ability to handle stressful situations.
  • Highly developed business acumen.
  • Strong critical thinking and decision-making skills.
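At their core, the RAG qualifications listed above come down to embedding documents, indexing the vectors, and retrieving the nearest chunks for a query. The sketch below is a toy, self-contained stand-in: a term-frequency `embed` function replaces a real embedding model, and an in-memory list replaces a managed vector database such as Pinecone or Vertex AI Vector Search.

```python
import math

def embed(text: str) -> dict:
    """Toy term-frequency 'embedding'; a real model returns dense vectors."""
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(n * b.get(w, 0) for w, n in a.items())
    na = math.sqrt(sum(n * n for n in a.values()))
    nb = math.sqrt(sum(n * n for n in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorIndex:
    """Minimal in-memory vector index: upsert chunks, query top-k."""
    def __init__(self):
        self.items = []  # list of (chunk_text, vector)

    def upsert(self, chunks):
        for c in chunks:
            self.items.append((c, embed(c)))

    def query(self, text, k=2):
        qv = embed(text)
        ranked = sorted(self.items, key=lambda it: cosine(qv, it[1]), reverse=True)
        return [c for c, _ in ranked[:k]]

index = VectorIndex()
index.upsert([
    "store hours and holiday schedule",
    "produce supplier onboarding checklist",
    "dairy cold-chain temperature requirements",
])
# Retrieved chunks would be prepended to the LLM prompt in a full RAG loop.
context = index.query("what temperature must dairy be kept at", k=1)
```

A production pipeline adds chunking, metadata filters, and prompt assembly, but the upsert/query shape stays the same.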


Working Conditions & Physical Demands

This position requires in-person office presence at least 4x a week.


Compensation and Benefits

The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.

Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.


Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.

Solution Architect - Microsoft Purview (Data Catalog & Governance)
Salary not disclosed
Newark, NJ 3 days ago
Title: Solution Architect - Microsoft Purview (Data Catalog & Governance)

Duration: 10+ Months

Location: Remote

Overview

We are seeking an experienced Solution Architect to lead the enterprise rollout of Microsoft Purview across a complex global, multi-cloud environment. The consultant will define architecture, implement domain-based governance, and drive adoption of Purview capabilities including cataloging, lineage, classification, access governance, and compliance controls.

Key Responsibilities


  • Architecture & Implementation
  • Define target-state architecture for Microsoft Purview across Azure, AWS, M365, on-prem, and third-party platforms.
  • Develop and drive the implementation roadmap across U.S. Businesses, PGIM, Corporate Technology, and international units.
  • Establish Purview reference architecture, integration patterns, and guardrails.
  • Domain-Based Governance
  • Design collections, hierarchies, and RBAC aligned to domain structures and legal entity boundaries.
  • Enable domain-owned stewardship while enforcing enterprise taxonomies and governance standards.
  • Platform Configuration
  • Configure Data Map, Catalog, Scans, Classifications, Sensitivity Labels, and Lineage.
  • Optimize scan strategy (frequency, cost, performance) and extend classifiers and metadata models.
  • Security & Compliance
  • Integrate Purview with M365 Information Protection, Entra ID, and security baselines.
  • Support PII/PCI/PHI detection, access governance, and regulatory compliance (SOX, GLBA, NYDFS, GDPR).
  • Engineering & Integration
  • Integrate with Synapse, Fabric, Databricks (including Unity Catalog), Snowflake, SQL Server, AWS sources, and SAP/Oracle.
  • Implement IaC (Bicep/Terraform), CI/CD for Purview artifacts, and automation via APIs.
  • Adoption & Stakeholder Management
  • Deliver training, onboarding playbooks, and steward enablement.
  • Lead workshops for new data domains and products.
  • Provide executive-level reporting on progress, risks, and KPIs.


Required Qualifications


  • 10+ years in data architecture/governance; 2+ years hands-on Purview experience at enterprise scale.
  • Strong expertise in metadata management, lineage, classification, scan optimization, glossary management, and domain-based operating models.
  • Solid Azure ecosystem knowledge (Storage, Key Vault, Synapse, Fabric, Databricks), M365 Information Protection, and Entra ID.
  • Experience with IaC (Bicep/Terraform), APIs/Atlas, and scripting (PowerShell/Python).
  • Financial services or regulated industry exposure.
  • Excellent communication, stakeholder leadership, and cross-domain facilitation skills.


SENIOR AWS DATA ENGINEER
Salary not disclosed
Irving, Texas 13 hours ago

Visa Status: US Citizen or Green Card Only

Location: Irving, TX (Local Candidates Only)

Employment Type: Full-time / Direct Hire

Work Environment: Hybrid (Monday through Thursday in office / Friday at home)

***MUST HAVE 10+ YEARS EXPERIENCE AS A DATA ENGINEER***

***US Citizen or Green Card Only***

The AWS Senior Data Engineer will own the planning, design, and implementation of data structures for this leading Hospitality Corporation in their AWS environment. This role will be responsible for incorporating all internal and external data sources into a robust, scalable, and comprehensive data model within AWS to support business intelligence and analytics needs throughout the company.

Responsibilities:

  • Collaborate with cross-functional teams to understand and define business intelligence needs and translate them into data modeling solutions
  • Develop, build, and maintain scalable data pipelines, data schema design, and dimensional data modeling in Databricks and AWS for all system data sources, API integrations, and bespoke data ingestion files from external sources, including batch and real-time pipelines.
  • Responsible for data cleansing, standardization, and quality control
  • Create data models that will support comprehensive data insights, business intelligence tools, and other data science initiatives
  • Create data models and ETL procedures with traceability, data lineage and source control
  • Design and implement data integration and data quality framework
  • Implement data monitoring best practices with trigger based alerts for data processing KPIs and anomalies
  • Investigate and remediate data problems, performing and documenting thorough and complete root cause analyses. Make recommendations for mitigation and prevention of future issues.
  • Work with Business and IT to assess efficacy of all legacy data sources, making recommendations for migration, anonymization, archival and/or destruction.
  • Continually seek to optimize performance through database indexing, query optimization, stored procedures, etc.
  • Ensure compliance with data governance and data security requirements, including data life cycle management, purge and traceability.
  • Create and manage documentation and change control mechanisms for all technical design, implementations and systems maintenance.
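The trigger-based alerting responsibility above reduces to a threshold check over per-run KPIs. A minimal sketch with invented metric names and thresholds; in the actual AWS environment this logic would typically sit behind CloudWatch alarms or a scheduled job rather than inline Python:

```python
# Hypothetical sketch of trigger-based KPI alerting: compare each pipeline
# run's metrics against configured thresholds and emit alerts on breach.

THRESHOLDS = {
    "rows_loaded_min": 1000,      # alert if fewer rows than expected
    "error_rate_max": 0.01,       # alert if >1% of records failed
    "runtime_seconds_max": 3600,  # alert if the batch ran over an hour
}

def check_run(metrics: dict) -> list:
    """Return human-readable alerts for a single pipeline run."""
    alerts = []
    if metrics["rows_loaded"] < THRESHOLDS["rows_loaded_min"]:
        alerts.append(f"low volume: {metrics['rows_loaded']} rows")
    if metrics["error_rate"] > THRESHOLDS["error_rate_max"]:
        alerts.append(f"error rate {metrics['error_rate']:.1%} over limit")
    if metrics["runtime_seconds"] > THRESHOLDS["runtime_seconds_max"]:
        alerts.append(f"slow run: {metrics['runtime_seconds']}s")
    return alerts

run = {"rows_loaded": 250, "error_rate": 0.04, "runtime_seconds": 900}
for alert in check_run(run):
    print("ALERT:", alert)  # in production this would page or post to a channel
```

Anomaly detection on top of this usually swaps the fixed thresholds for rolling statistics over historical runs; the trigger shape is unchanged.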

Target Skills and Experience

  • Bachelor's or graduate degree in computer science, information systems or related field preferred, or similar combination of education and experience
  • At least 10 years' experience designing and managing data pipelines, schema modeling, and data processing systems.
  • Experience with Databricks, or similar tools such as Microsoft Fabric or Snowflake, to drive scalable data solutions is a plus
  • Experience with SAP a plus
  • Proficient in Python, with a track record of solving real-world data challenges.
  • Advanced SQL skills, including experience with database design, query optimization, and stored procedures.
  • Experience with Terraform or other infrastructure-as-code tools is a plus.
Data Governance Manager
Salary not disclosed
Dallas, Texas 13 hours ago

Must be local to TX

Skills:

  • Delivery manager
  • 2026 roadmap
  • To deliver the roadmap: interact with the business, explain the value proposition, and understand their rules and standard rules
  • Manage timelines
  • Partner with segments
  • Before-and-after Data Quality scores

Technical:

  • Articulate technical design and solutions
  • Capabilities of Collibra and Soda
  • How to use those tools
  • Proactive communication skills
  • 12+ years in a Technical Project Manager-type role with solutioning and problem-solving skills

Role Summary

The Data Governance Lead will design, build, and scale an enterprise data governance program from the ground up, using Collibra as the core platform for a large real estate enterprise. This senior role combines strategic leadership, hands‐on Collibra configuration, stakeholder management, and deep domain knowledge of real estate data. The incumbent will own the governance vision, operating model, and tooling, and will partner with business, IT, data engineering, analytics, legal, and compliance teams.

Key Responsibilities

1. Data Governance Strategy and Operating Model

  • Define and implement the enterprise data governance strategy, roadmap, and operating model aligned to business objectives.
  • Define governance KPIs, maturity metrics, and success measures.
  • Drive adoption through change management, communications, and training.

2. Collibra Implementation from Scratch

  • Lead end‐to‐end Collibra implementation: platform setup, environment planning (Dev/Test/Prod), domain modeling, and taxonomy design.
  • Customize asset models for real estate use cases.
  • Configure and manage Business Glossary, Data Dictionary, Data Catalog, and Reference Data & Code Sets.
  • Design and implement Collibra workflows for glossary lifecycle, owner/steward assignment, issue management, and escalation.
  • Implement Collibra operating model with defined roles (Data Owner, Data Steward, Custodian, Consumer) and RACI mappings.
  • Integrate Collibra with data warehouses/lakes (Snowflake, BigQuery, Azure), BI tools (Power BI, Tableau), and ETL/ELT tools (Informatica, dbt, ADF).
  • Lead metadata ingestion across technical, operational, and business metadata.

3. Data Ownership, Stewardship, and Accountability

  • Define and institutionalize data ownership and stewardship across business units.
  • Partner with business leaders to assign Data Owners and Stewards.
  • Drive accountability for data definitions, data quality, and metadata completeness.
  • Establish Data Governance Councils and working groups.

4. Data Quality and Issue Management

  • Collaborate with data quality teams to define Critical Data Elements (CDEs) and align rules and thresholds.
  • Configure Collibra issue management workflows and ensure traceability from issues to root causes and remediation actions.
  • Provide governance oversight for remediation and continuous improvement.

5. Compliance, Risk, and Security Governance

  • Define governance controls for regulatory compliance, contractual data, and financial reporting.
  • Partner with Legal, Risk, and Security to classify sensitive data and apply access and usage policies.
  • Implement data classification and privacy metadata within Collibra.

6. Stakeholder and Program Leadership

  • Serve as the single point of accountability for the data governance program.
  • Present progress, metrics, and risks to senior leadership.
  • Mentor governance analysts, stewards, and platform administrators.
  • Coordinate with system integrators and vendors as required.

Required Skills and Qualifications

Mandatory

  • 12–18+ years in data management, data governance, or analytics leadership.
  • Deep hands‐on experience implementing Collibra from scratch at enterprise scale.
  • Strong expertise in business glossary and metadata management, stewardship models, and workflow automation in Collibra.
  • Proven track record driving enterprise adoption of governance platforms.
  • Excellent stakeholder management and communication skills.

Preferred

  • Experience in real estate, property management, construction, facilities, or capital projects.
  • Familiarity with DAMA‐DMBOK, DCAM, or similar governance frameworks.
  • Exposure to data quality tools such as SODA, Great Expectations, or Informatica DQ.
  • Experience integrating Collibra with cloud data platforms.
  • Prior experience leading governance programs in large, federated organizations.
  • Collibra certification is a plus.

Behavioral and Leadership Attributes

  • Strategic thinker with strong execution capability.
  • Balances business pragmatism with governance rigor.
  • Influences without formal authority and drives change.
  • Excellent storytelling and change management skills.
  • Hands‐on leader who can configure Collibra and mentor teams.

Success Measures First 12 Months

  • Collibra platform live with core real estate domains onboarded.
  • Business glossary adopted across key business units.
  • Formal data ownership established for critical datasets.
  • Measurable improvement in metadata completeness and data quality visibility.
  • Governance operating model embedded into daily business processes.
Data Scientist
Salary not disclosed
McLean, Virginia 13 hours ago
Hiring - Data Scientist
Location: Dallas TX or McLean VA
Cliff W2
In-person interview
Onsite
  • 5+ years in data science, analytics, or cloud financial operations
  • Expertise in Python, SQL, and data science libraries (e.g., pandas, scikit-learn)
  • Strong statistical modeling and machine learning skills
  • Deep understanding of Azure and AWS cost structures and optimization levers
  • Excellent communication and stakeholder engagement skills
  • Experience with BI tools (Power BI, Tableau)
Please contact
Manufacturing Data & Sales Analyst
LHH
Salary not disclosed
Addison, IL 1 day ago

LHH Recruitment Solutions has partnered with a growing organization, and they are seeking a motivated Manufacturing Data & Sales Analyst to join their team. They are seeking a data-driven analytics professional who thrives at the intersection of manufacturing operations, business intelligence, and executive decision support. This is a high-impact role for someone who enjoys building insight from the ground up: designing dashboards, automating reporting, owning data integrity, and translating complex information into clear, actionable business outcomes.


Why This Role Stands Out:

  • High visibility and direct partnership with senior leadership.
  • Opportunity to own and evolve enterprise-level analytics and reporting.
  • Manufacturing environment where data truly drives strategy.
  • Long-term growth potential in a stable, well-capitalized organization.


Key Responsibilities:

Data, Analytics & Reporting:

  • Design, build, and continuously enhance dashboards, scorecards, and KPI reporting to support operational and commercial performance.
  • Translate raw data into meaningful insights that influence decision-making at the executive level.
  • Automate recurring reports and analytics processes to improve efficiency, accuracy, and scalability.
  • Analyze trends related to revenue, production performance, forecasting, and product initiatives.

Manufacturing & Cross-Functional Partnership:

  • Collaborate closely with Operations, Finance, IT, and Commercial teams to align data, metrics, and performance goals.
  • Support forecasting, planning cycles, and performance reviews with reliable, actionable analytics.
  • Identify risks, opportunities, and performance gaps within data sets and recommend solutions.

Systems & Data Ownership:

  • Act as the primary owner of manufacturing and sales-related data systems, ensuring usability, accuracy, and value.
  • Lead continuous improvement of reporting tools and system integrations.
  • Partner with internal and external stakeholders to enhance system reporting capabilities.
  • Champion data governance, consistency, and best practices across the organization.


Qualifications and Skills:

  • Bachelor’s Degree in Data Science, Analytics, Business Intelligence, or a related field
  • Proven experience building and maintaining dashboards, scorecards, and analytics tools.
  • Background supporting a manufacturing environment.
  • Strong ability to own data end-to-end—from extraction to interpretation to executive presentation.
  • Experience automating reporting and analytics processes.
  • Advanced analytical, problem-solving, and critical-thinking skills.
  • Ability to clearly communicate insights to both technical and non-technical audiences.
  • Advanced proficiency with Excel, reporting platforms, and Microsoft Office Suite.
  • Advanced proficiency in SQL, PowerBI, and/or Tableau.
  • Experience with IQMS is preferred.
  • Strategic mindset with exceptional attention to detail.


Compensation Range: $90,000 - $120,000 + 15% Bonus


Benefits Offered: 2 weeks of vacation, paid sick leave where applicable by state law, Medical Insurance, Dental Insurance, Vision Insurance, 401K, and Life Insurance.


If you are a passionate Manufacturing Data & Sales Analyst looking for a new and rewarding career, please apply today! You don’t want to miss out on this opportunity!


LHH is a leader in permanent recruitment and in the placement of top talent. Our areas of specialty include office administration, customer service, human resources, engineering, and supply chain and logistics. Please feel free to check us out and apply for other opportunities if this role isn’t a perfect match.


Equal Opportunity Employer/Veterans/Disabled


To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit

Azure Data Engineer
Salary not disclosed
Queens 1 day ago
Job Description: We are seeking a hands-on Consultant with strong Azure ETL experience and advanced Power BI development skills.

Candidates are required to have experience modernizing legacy Microsoft BI environments (including SSIS).

This is not an SSIS-only role.

The consultant will design, modernize, and enhance enterprise data and analytics solutions supporting Cyber Security, Physical Security, Electronic Security and Police operations.

This role includes evolving legacy SQL Server/SSIS-based processes into modern Azure data architectures while designing scalable new ETL/ELT pipelines and delivering executive-level analytics solutions.

The consultant will work directly with stakeholders to deliver production-grade reporting and analytics capabilities across multiple enterprise systems.

This requires architectural thinking and hands-on technical execution.

Core Responsibilities: Candidates must have direct experience building enterprise-grade ETL pipelines and executive Power BI dashboards.

Core Duties:

  • Design and implement modern ETL/ELT pipelines in Azure
  • Assess and refactor existing SSIS packages as part of broader modernization efforts
  • Architect Lakehouse / Medallion data models
  • Develop optimized dimensional data models (star schema)
  • Integrate data from SQL Server, Oracle, APIs, and security platforms
  • Design and deploy enterprise Power BI dashboards
  • Build paginated reports using Power BI Report Builder
  • Optimize DAX and dataset performance
  • Implement Row-Level Security (RLS)
  • Support CI/CD and DevOps deployment processes
  • Produce technical documentation and data lineage artifacts
  • Engage directly with executive stakeholders

Required Technical Skills (Must-Have):

Data Engineering & Architecture:

  • Strong ETL/ELT design and optimization experience
  • Advanced SQL (expert-level required)
  • Python / PySpark
  • Dimensional data modeling (star schema required)
  • REST API integrations

Azure Data Stack:

  • Azure Data Factory
  • Azure Databricks
  • Azure Synapse Analytics
  • Azure Data Lake Storage

Microsoft Data Platform:

  • Experience with SQL Server data warehouse environments
  • Working knowledge of SSIS and experience modernizing or migrating SSIS workflows to Azure-based solutions

Power BI:

  • Power BI Desktop (expert-level)
  • Advanced DAX
  • Executive dashboard development
  • Paginated reports (Power BI Report Builder)
  • Data Gateway configuration
  • Incremental refresh
  • Row-Level Security (RLS)

Nice to Have:

  • Microsoft Purview
  • Terraform (Infrastructure-as-Code)
  • Orchestration tools (Airflow or equivalent)
  • Security systems data integration experience
  • Experience with C# / .NET web application development (for integration with internal systems or APIs)

Experience Requirements:

  • 7+ years of hands-on data engineering / analytics delivery
  • Demonstrated experience building production data pipelines in Azure
  • Proven experience delivering executive-facing Power BI solutions
  • Experience working in complex enterprise environments

Software Skills: 4–6 years of experience in Azure for building, deploying, and managing cloud-based data and application services.
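Several of the skills listed above (incremental refresh, ETL/ELT pipeline design) share one pattern: persist the high-water mark of the last load and extract only newer rows. A minimal, platform-agnostic sketch; the table contents and column names are invented, and in practice this logic would live in an Azure Data Factory pipeline or Databricks job:

```python
from datetime import datetime

# Hypothetical source rows; in ADF/Databricks this would be a SQL query
# filtered on the watermark column (e.g. WHERE ModifiedDate > @watermark).
SOURCE = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]

def incremental_extract(watermark: datetime) -> tuple:
    """Return rows newer than the watermark, plus the advanced watermark."""
    fresh = [r for r in SOURCE if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in fresh), default=watermark)
    return fresh, new_watermark

# First load after Jan 2 picks up rows 2 and 3 and advances the watermark;
# an immediate second run picks up nothing.
batch, wm = incremental_extract(datetime(2024, 1, 2))
again, wm2 = incremental_extract(wm)
```

The same shape underlies Power BI incremental refresh: partitions are refreshed only where the source data is newer than the stored range boundary.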

Technical Skills: 2–4 years of experience in .NET code development for developing and maintaining enterprise applications and data processing components.

6+ years of experience in Data Modeling including designing logical and physical data models for enterprise data warehouses and analytics systems.

6+ years of experience in Python scripting for data processing, automation, ETL development, and data transformation tasks.

6+ years of experience in Structured Query Language (SQL) for writing complex queries, stored procedures, performance tuning, and data manipulation.
Data Steward
Salary not disclosed
Creve Coeur, MO 1 day ago

Job Summary:

Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.

Duties:

  • Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
  • Participate in data scraping, data curation and data compilation efforts
  • Ensure high quality of the data to end users
  • Ensure high quality of the in-house data via data stewardship
  • Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
  • Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
  • Define data quality rules and implement automated monitoring, reporting, and remediation solutions
  • Coordinate intake and resolution of data support tickets
  • Support data migration from legacy systems, data inserts and updates not supported by applications
  • Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
  • Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
  • Create mapping documents and templates to improve existing manual processes
  • Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
  • Help answer questions from the end-users and coordinate with technical resources as needed
  • Build prototype SQL and continuously engage with end consumers on enhancements
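Data quality rules like those described in the duties above usually reduce to column-level checks (completeness, uniqueness, format) evaluated on each refresh and reported for remediation. A small illustrative sketch in plain Python, with invented rule names and thresholds:

```python
# Hypothetical data-quality rule sketch: each rule profiles one column
# and fails when the measured score drops below its threshold.

def completeness(rows, col):
    """Fraction of rows where the column is present and non-empty."""
    return sum(1 for r in rows if r.get(col) not in (None, "")) / len(rows)

def uniqueness(rows, col):
    """Fraction of populated values that are distinct."""
    values = [r[col] for r in rows if r.get(col) not in (None, "")]
    return len(set(values)) / len(values) if values else 0.0

# Invented rule set: (name, check, minimum acceptable score).
RULES = [
    ("email completeness", lambda rows: completeness(rows, "email"), 0.70),
    ("id uniqueness",      lambda rows: uniqueness(rows, "id"),      1.00),
]

def run_rules(rows):
    """Evaluate every rule and report (score, passed) for monitoring."""
    report = {}
    for name, check, threshold in RULES:
        score = check(rows)
        report[name] = (score, score >= threshold)
    return report

rows = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},
    {"id": 2, "email": "c@x.com"},  # duplicate id trips the uniqueness rule
    {"id": 3, "email": "d@x.com"},
]
report = run_rules(rows)
```

Failed rules would feed the automated monitoring, reporting, and remediation loop the posting describes, for example by opening a data support ticket.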


Desired Skills/Experience:

  • Bachelor's Degree in Computer Science, Engineering, Science, or other related field
  • Applied experience with modern engineering technologies and data principles, for instance: Big Data, cloud compute, NoSQL, etc.
  • Applied experience querying SQL and/or NoSQL databases
  • Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
  • Data Warehousing experience
  • Strong written and verbal communication skills
  • Comfortable balancing demands across multiple projects / initiatives
  • Ability to identify gaps in requirements based on business subject matter domain expertise
  • Ability to deliver detailed technical documentation
  • Expert level experience in relevant business domain
  • Experience managing data within SAP
  • Experience managing data using APIs
  • BigQuery experience

Benefits:

  • Medical, Dental, & Vision Insurance Plans
  • Employee-Owned Profit Sharing (ESOP)
  • 401K offered


The approximate pay range for this position starts at $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.

At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.

By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at

Staff Software Engineer, Conversion Data Privacy
✦ New
Salary not disclosed
San Francisco, CA 1 day ago

About Pinterest:


Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.


Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.


At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.


Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.

Team & Mission


The Privacy & Conversion Data team is responsible for how the company safely and compliantly uses conversion data to power monetization. We build and operate the core privacy infrastructure behind ads reporting and optimization, including controlled data environments, fine-grained access controls, centralized privacy rules enforcement, and de-identification pipelines for conversion data. Our mission is to make conversion data privacy-preserving by default: centralized, de-identified, auditable, and easy for teams to use, while maintaining high utility for advertisers and staying ahead of an evolving global regulatory landscape.



Role Summary


We're seeking a Staff Engineer to lead the architecture and technical direction for the conversion data privacy platform, spanning both core Conversion Data systems and de-identification for ads reporting. You'll own the end-to-end design and evolution of privacy-critical pipelines and services, partner closely with Product, Data Science, Legal, and infrastructure teams, and set the technical bar for how we use conversion data safely at scale.



What you'll do:



  • Lead the technical strategy and architecture for conversion data privacy across access controls, de-identification, deletion, and privacy rules enforcement, driving toward a centralized, de-identified-by-default, automated privacy platform for monetization.
  • Design and evolve core privacy infrastructure including controlled environments for sensitive data, fine-grained authorization and policy enforcement, and a central policy repository that consistently governs access across major data platforms and query engines.
  • Own de-identification pipelines for ads reporting end-to-end: from separating sensitive and non-sensitive data, applying de-identification techniques and transformations, and generating privacy-preserving datasets, to validating data utility and feeding reporting and analytics surfaces.
  • Build and improve privacy frameworks and tooling (for both online and offline workflows) that make safe, compliant conversion data usage simple and self-service for downstream teams, reducing onboarding friction for new datasets, restrictions, and use cases.
  • Drive operational excellence and compliance by defining SLAs, building robust monitoring and alerting (e.g., de-identification quality, opt-out metrics, data leakages), leading incident response, and developing performant deletion and leakage-handling workflows that meet regulatory and audit requirements.
  • Partner cross-functionally with ads, data, product, legal, and infrastructure stakeholders to translate legal/privacy requirements into technical designs, make clear trade-offs between privacy and utility, and drive alignment on roadmaps, launches, and policy changes that impact advertisers and users.
  • Mentor and up-level engineers across multiple teams, lead critical design and code reviews in privacy-sensitive areas, and establish best practices and documentation for privacy-by-design, de-identification, and large-scale data systems.


What we're looking for:



  • BS+ in Computer Science (or related field) or equivalent practical experience.
  • 8+ years of professional software engineering experience, with a focus on large-scale data systems or distributed systems.
  • Strong proficiency building and operating data pipelines and services using Java/Scala/Kotlin or Python, plus SQL; experience with modern big data ecosystems is a plus.
  • Experience designing secure, reliable systems and APIs, with solid grounding in data modeling, access control, and performance optimization.
  • Meaningful experience in at least one of: privacy-preserving data systems (e.g., de-identification, k-anonymity), ads measurement/attribution, or large-scale analytics/experimentation platforms.
  • Proven ability to drive cross-team technical initiatives from design through rollout, working closely with product, data science, and non-engineering partners (e.g., Legal, Compliance).
  • Strong communication and leadership skills, with a track record of mentoring engineers, raising engineering standards, and making sound decisions in ambiguous, high-impact problem spaces.
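The k-anonymity technique named in the requirements above can be illustrated with a minimal check: a dataset is k-anonymous when every combination of quasi-identifier values appears at least k times. This is a hypothetical sketch, not Pinterest's implementation; the field names and records are invented:

```python
# Minimal sketch of a k-anonymity check over quasi-identifiers.
# Field names and sample records are hypothetical.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every quasi-identifier combination occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

events = [
    {"zip_prefix": "941", "age_band": "30-39", "converted": 1},
    {"zip_prefix": "941", "age_band": "30-39", "converted": 0},
    {"zip_prefix": "100", "age_band": "20-29", "converted": 1},  # unique group
]
ok = is_k_anonymous(events, ["zip_prefix", "age_band"], k=2)
```

Here `ok` is false because one quasi-identifier group contains a single record; a real de-identification pipeline would generalize or suppress such groups before releasing the dataset.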


In-Office Requirement Statement:



  • We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.


Relocation Statement:



  • This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.


#LI-REMOTE


#LI-KK6

At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.


Information regarding the culture at Pinterest and benefits available for this position can be found here.

US based applicants only: $177,185 - $364,795 USD

Our Commitment to Inclusion:


Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
