Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.
Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture.
Drive long-term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies (a brief illustrative sketch follows this list).
Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
Conduct root cause analysis and implement preventive and long-term remediation solutions.
Optimize SQL queries, tune stored procedures, and improve data processing performance.
Document audit findings, validation processes, data flows, standards, and quality reports.
Build dashboards and reports for data quality KPIs using Power BI/Tableau.
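For illustration only (the posting itself contains no code): a minimal sketch of the kind of automated data quality rule described above, written in Python with pandas as a stand-in for the T-SQL/SSIS implementation the role calls for; the column names and rules are hypothetical.

```python
# Minimal data quality rule sketch (hypothetical columns; a real implementation
# would run as T-SQL validation logic against SQL Server).
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Profile a dataset and return one pass/fail row per validation rule."""
    results = [
        # Completeness: the business key must never be null.
        ("customer_id not null", bool(df["customer_id"].notna().all())),
        # Uniqueness: the business key must not repeat.
        ("customer_id unique", df["customer_id"].is_unique),
        # Validity: monetary amounts must be non-negative.
        ("amount >= 0", bool((df["amount"] >= 0).all())),
        # Timeliness: load dates must not be in the future.
        ("load_date <= today", bool((df["load_date"] <= pd.Timestamp.today()).all())),
    ]
    return pd.DataFrame(results, columns=["rule", "passed"])

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 3],
        "amount": [100.0, 250.5, -5.0],  # -5.0 should fail the validity rule
        "load_date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03"]),
    })
    print(run_quality_checks(sample))
```

In practice, each failed rule would feed the quality-gap identification and KPI dashboards the posting describes.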
Data Stewardship & Governance
Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
Ensure proper and consistent data usage across departments and systems.
Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
Provide training on data entry, data handling, stewardship practices, and data literacy.
Collaborate with cross functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield / Low-Maturity Environment
Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
Drive adoption of data quality and governance practices across business and technical teams.
Support long term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
Advanced T-SQL, SQL Server development, debugging, and performance tuning.
SSIS development, deployment, and troubleshooting.
Data profiling, validation rule design, quality scoring, and measurement techniques.
ETL/ELT pipeline design, debugging, and optimization.
Data modeling (conceptual, logical, physical).
Metadata management and lineage documentation.
Reporting and dashboarding with Power BI, Tableau, or similar tools.
Strong documentation and communication skills.
Preferred Skills
Knowledge of DAMA DMBoK, DCAM, MDM concepts, and governance frameworks.
Experience in low-maturity/greenfield data environments.
Familiarity with AI/ML data readiness and feature-store-aligned data structuring.
Cloud data engineering exposure (Azure, Databricks, GCP).
Education
Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.
Master’s degree preferred.
Certifications (Preferred)
- DAMA CDMP (Associate/Practitioner)
- EDM Council DCAM
- ASQ Data Quality Credential
- Collibra Data Steward Certification
- Certified Data Steward (eLearningCurve)
- Cloud/AI certifications (Azure, Databricks, Google)
You've racked up certificates, aced LeetCode challenges, and you know your way around system design like the back of your hand.
On paper, you're everything a tech company wants.
However, tech stacks and requirements change every day.
Since 2010, we've helped thousands of candidates land full-time jobs at tech leaders like Google, Apple, PayPal, Visa, Western Union, Wells Fargo, Wayfair, and hundreds more, with job offers of $95k to $154k.
SynergisticIT focuses on closing the gap between your tech skills and what employers want now.
Open Roles We're Hiring For (on behalf of our clients):
- Entry-Level Software Programmers (Java/Python)
- Java Full Stack Developers
- Data Analysts & BI Engineers
- Data Scientists & ML Engineers
All visa types and U.S. citizens are encouraged to apply.
Note: Internships, freelance, or personal projects will not be considered toward experience requirements.
If you submit your resume, please be advised it may be entered into a central database shared by our JOPP team (our placement program).
You may unsubscribe from our emails at any time.
Please check the links below: the SynergisticIT USA Today article and videos of SynergisticIT at OCW, JavaOne, and the Gartner Summit.
We focus on Java/Full Stack/DevOps, Data Science, Data Engineer, Data Analyst, BI Analyst, and Machine Learning/AI candidates.
Ideal Candidates:
- Recent grads in CS, Engineering, Math, or Statistics with limited or no job experience
- Students who recently finished their Bachelor's or Master's programs
- Jobseekers laid off due to downsizing who want to move into an in-demand tech stack
- Professionals seeking a career switch to tech
- Candidates with career gaps or lacking real-world experience
- Individuals looking to boost their skill portfolio for better job prospects
- Those struggling to land interviews despite having experience
- Candidates on F1/OPT needing a job for a STEM extension or H-1B filing
Currently, we are looking for entry-level software programmers, Java Full Stack developers, Python/Java developers, Data Analysts/Data Engineers/Data Scientists, and Machine Learning engineers for full-time positions with clients.
Top tech companies are flooded with smart grads.
What gets you in the door now is real-world application, confidence in delivery, and the soft skills to own a room—or a Zoom.
Please check the links below:
- Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
- Technical Skills or Experience? Which one is important to get a Job? | SynergisticIT
- Backend vs. Full Stack Development: Job Prospects | SynergisticIT
- What Recruiters Look for in Junior Developers | SynergisticIT
- Software Engineering or Data Science as a Career? | SynergisticIT
- How OPT Students Can Land Tech Jobs | SynergisticIT
- Is AI Going to Replace Software Programmers? | SynergisticIT
- The Market's Changed—Have You?
Please note: resume databases are shared with clients, and interested clients will reach out directly if they find a qualified candidate for their req.
Resume submissions may also be shared with our JOPP team database.
If you don't want to be contacted, please don't submit your resume; if you are contacted, you may unsubscribe at any time.
Sr Data & BI Engineer (Hybrid)
We're partnering with a growing organization seeking a SQL-focused Data & BI Engineer to build and optimize data pipelines, support ETL processes, and drive reporting infrastructure. This role sits at the intersection of data engineering and business intelligence, with strong visibility across teams and leadership.
What You'll Do
- Design, build, and maintain SQL-based data pipelines and transformations
- Develop and optimize ETL processes to support reporting and analytics (a minimal sketch follows this list)
- Write performant SQL for data modeling, transformation, and downstream consumption
- Support and enhance reporting infrastructure (SSRS → Power BI migration)
- Partner with business and technical teams to deliver scalable data solutions
- Improve data quality, structure, and accessibility across systems
- Contribute to performance tuning and optimization of data workflows
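For illustration only (the role centers on SQL and SSIS; this Python stand-in just shows the shape of the work): a minimal extract-transform-load sketch. The file, table, and column names are hypothetical.

```python
# Minimal ETL sketch: extract from a flat file, cleanse and derive columns,
# load into a reporting table. Names are placeholders, not the client's model.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path, parse_dates=["order_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Cleanse: drop rows missing the business key, standardize casing.
    df = df.dropna(subset=["order_id"]).copy()
    df["region"] = df["region"].str.upper()
    # Derive a reporting column downstream dashboards can group on.
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
    return df

def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    df.to_sql("fct_orders", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    conn = sqlite3.connect("reporting.db")  # stand-in for SQL Server
    load(transform(extract("orders.csv")), conn)
    conn.close()
```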
What You Bring
- Strong SQL skills with experience in data transformation and pipeline development
- Experience with ETL tools or frameworks (SSIS or similar)
- Exposure to BI tools such as Power BI or SSRS
- Experience working with structured data models in a production environment
- Ability to operate across both data engineering and reporting use cases
Environment
- Hybrid: 3 days onsite
- Evolving data environment with active investment in modernization
- Transitioning reporting stack from SSRS to Power BI
- Collaborative team with dedicated DBA support
Compensation
$120K – $140K base + bonus potential and good benefits
Translate business process designs into clear master and transactional data definitions for S/4HANA.
Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.
Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).
Define data objects, attributes, and mandatory fields.
Support business rules, validations, and derivations.
Align data structures to SAP best practices and industry standards.
Support data cleansing, enrichment, and harmonization activities.
Define and validate data mapping rules from legacy systems to S/4HANA.
Participate in mock conversions, data loads, and reconciliation activities.
Ensure data quality thresholds are met prior to cutover.
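For illustration (not part of the posting): a hedged Python sketch of the mapping validation and reconciliation activities described above. The mapping table, field names, and thresholds are hypothetical, not actual SAP object definitions.

```python
# Sketch of legacy-to-S/4HANA mapping checks: unmapped values, count
# reconciliation, and mandatory-field completeness. All names are hypothetical.
import pandas as pd

# Hypothetical mapping rule: legacy material type -> S/4HANA material type.
MATERIAL_TYPE_MAP = {"RAW": "ROH", "FIN": "FERT", "SEMI": "HALB"}

def validate_migration(legacy: pd.DataFrame, migrated: pd.DataFrame) -> list[str]:
    issues = []
    # Rule 1: every legacy material type must have a defined target mapping.
    unmapped = set(legacy["material_type"]) - set(MATERIAL_TYPE_MAP)
    if unmapped:
        issues.append(f"Unmapped legacy material types: {sorted(unmapped)}")
    # Rule 2: reconciliation - record counts must match after the mock load.
    if len(legacy) != len(migrated):
        issues.append(f"Count mismatch: legacy={len(legacy)}, migrated={len(migrated)}")
    # Rule 3: mandatory target fields must be fully populated before cutover.
    missing = int(migrated["base_unit"].isna().sum())
    if missing:
        issues.append(f"{missing} migrated record(s) missing mandatory base_unit")
    return issues

if __name__ == "__main__":
    legacy = pd.DataFrame({"material_type": ["RAW", "FIN", "XXX"]})
    migrated = pd.DataFrame({"base_unit": ["EA", None, "KG"]})
    for issue in validate_migration(legacy, migrated):
        print(issue)
```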
Support the establishment and enforcement of global data standards and policies.
Work closely with Master Data and Data Governance teams.
Help define roles, ownership, and stewardship models for value stream data.
Contribute to data quality monitoring and remediation processes.
Support functional and integrated testing with a strong focus on data accuracy.
Validate business scenarios using migrated and created data.
Support cutover planning and execution from a data perspective.
Provide post-go-live support and stabilization.
Requirements: 5 years of SAP functional experience with a strong data focus.
Hands-on experience with SAP S/4HANA (greenfield preferred).
Proven involvement in large-scale, global ERP implementations.
Deep understanding of value stream business processes and related data objects.
Experience supporting data migration, cleansing, and validation.
Required Skills: Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).
Understanding of S/4HANA data model changes vs. ECC.
Experience working with SAP MDG or similar governance tools preferred.
Familiarity with data migration tools (e.g., SAP Migration Cockpit, LVM, ETL tools).
Ability to read and interpret functional specs and data models.
Strong stakeholder management and communication skills.
Ability to work across global, cross-functional teams.
Detail-oriented with strong analytical and problem-solving skills.
Comfortable operating in a fast-paced transformation environment.
Preferred Skills: Experience in manufacturing, building materials, or asset-intensive industries.
Prior role as Functional Data Lead or Data Domain Lead.
Experience defining global templates and harmonized data models.
Knowledge of data quality tools and metrics.
Experience with MDG and setting up cost center and profit center groups.
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology infrastructure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained – either through formal education or through experience – in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations (a minimal orchestration sketch follows this list).
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
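Purely as an illustration of the orchestration pattern behind these responsibilities (the qualifications below mention Cloud Composer/Airflow): a minimal DAG sketch assuming Airflow 2.x. The DAG id, schedule, and task bodies are placeholders, not Wakefern systems.

```python
# Minimal Airflow 2.x DAG sketch: two placeholder tasks wired in sequence.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_fn():
    print("pull source data, e.g., from an API or a Pub/Sub subscription")

def load_fn():
    print("write transformed rows, e.g., to a BigQuery table")

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_fn)
    load = PythonOperator(task_id="load", python_callable=load_fn)
    extract >> load  # extract must finish before load starts
```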
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search); a minimal sketch follows this list.
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
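To illustrate the RAG item flagged earlier in this list (not Wakefern's actual stack): a minimal retrieval sketch in Python. The embed() function is a toy stand-in for a real embedding model, and the in-memory index stands in for a vector database such as Pinecone or Vertex AI Vector Search.

```python
# Minimal RAG retrieval sketch: index document chunks as vectors, then return
# the most similar chunks for a query. Everything here is a placeholder.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Toy embedding: hash characters into a fixed-size unit vector.
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# "Index" the curated knowledge base: one vector per document chunk.
CHUNKS = [
    "Store hours are 8am to 9pm on weekdays.",
    "Returns are accepted within 30 days with a receipt.",
    "The loyalty program awards one point per dollar spent.",
]
INDEX = np.stack([embed(c) for c in CHUNKS])

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the query (cosine similarity)."""
    scores = INDEX @ embed(query)
    return [CHUNKS[i] for i in np.argsort(scores)[::-1][:k]]

if __name__ == "__main__":
    # Retrieved chunks would be prepended to the LLM prompt as grounding context.
    print(retrieve("What is the return policy?"))
```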
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Sr. Full Stack Engineer
Job ID
2025-2140
# of Openings
1
Overview
Currently seeking multiple Full Stack Developers in support of the U.S. Citizenship and Immigration Services (USCIS) Engineering Support for Identity Services (ESIS) program. This individual will support Agile application development technologies and capabilities in the areas of software development, systems engineering, integration, and test of software applications and infrastructure. Will be skilled with front-end, back-end, and database development. Design and implement full stack cloud solutions, including IaaS, PaaS, and SaaS. Design and deploy computing infrastructure, physical or virtual machines, and other resources such as virtual-machine disk image libraries, block and file-based storage, firewalls, load balancers, IP addresses, and virtual local area networks. Implement cloud-based platform services for AWS. Implement cloud-based software as a service for AWS. Perform DevOps functions.
Key Skills:
- 10+ years of experience with full stack engineering with proficiency in database development/integration as well as server and client application development/integration
- Software developing experience using Python and Java Spring framework
- Experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
- AWS experience required with experience deploying enterprise applications in AWS
- Experience with CI/CD environment tools such as Docker, Jenkins, Ansible, Kubernetes
Responsibilities
- Software development with Python, Java, React, and various scripting languages
- Design data models and web APIs and creation of software tasks from system requirements
- Perform requirements analysis, design, development, unit, and integration testing of software, troubleshooting and debugging of the system
- Immediate responsibilities will include enhancing and maintaining the existing system as well as design, development, and documentation of new features
- Create Git releases, pull requests, and code reviews
- Query logs using Splunk and monitor dashboards using New Relic
- Use Atlassian tools for day-to-day tasks within the Scrum process
- Implement web services, data persistence access features, and external interfaces (a minimal sketch follows this list)
- Partner closely with front-end and database engineers to ensure features are developed holistically
- Follow Agile software development methodology and team architecture standards
- Read and interpret architecture diagrams
- Improve code coverage through service mocking, test-driven development, and unit testing
- Modify Helm charts, Jenkinsfiles, and Dockerfiles
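For illustration only: a minimal Python web-service sketch of the "web services plus data persistence access" responsibility above, using Flask. The resource name, fields, and in-memory store are hypothetical; the actual system also uses the Java Spring framework.

```python
# Minimal REST service sketch (Flask 2.x): one resource with GET and POST.
from flask import Flask, jsonify, request

app = Flask(__name__)
records: dict[int, dict] = {}  # in-memory stand-in for a real database layer

@app.get("/records/<int:record_id>")
def get_record(record_id: int):
    record = records.get(record_id)
    if record is None:
        return jsonify(error="not found"), 404
    return jsonify(record), 200

@app.post("/records")
def create_record():
    payload = request.get_json()
    record_id = len(records) + 1
    records[record_id] = payload
    return jsonify(id=record_id), 201

if __name__ == "__main__":
    app.run(port=8080)
```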
Qualifications
- MUST BE US CITIZEN
- Bachelor's degree required
- Must be able to obtain and maintain a Public Trust security clearance
- 10+ years of experience in Software Engineering
- Must have experience in Python and Java Spring Framework (Boot, Batch, Data, Security)
- Must have experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
- Experience with design, development, enhancement, troubleshooting and debugging of web applications
- Must have experience in an AWS cloud environment and with CI/CD tools (e.g., Docker, Jenkins, Kubernetes) for deployment processes, monitoring production environments, and modifying Docker/Jenkins files and Helm charts
- Experience with scripting languages (Python, Bash, PowerShell, Perl) is not required but nice to have
- Understanding of branching concepts and experience using tools such as Git, VS Code, and/or Rancher
- Experience with creating Git releases, creating pull requests, and reviewing code
- Experience monitoring dashboards utilizing New Relic
- Experience with Splunk to query logs
- Experience with JUnit testing preferred
- Experience creating release instructions utilizing JIRA
- Experience developing and integrating complex software systems through the full SDLC
- Experience with Agile Scrum
- Must have strong written and verbal communication skills
Target Pay Range
The below listed pay range for this position is not a guarantee of compensation or salary. The final offered salary will be influenced by a host of factors including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications. Our employees value the flexibility at Pyramid Systems that allows them to balance quality work and their personal lives. We offer competitive compensation and benefits, including our Employee Stock Ownership Program, FlexPTO, and learning and development opportunities.
Pyramid Min
USD $125,731.00/Yr.
Pyramid Max
USD $188,597.00/Yr.
Why Pyramid?
Pyramid Systems, Inc. is an award-winning technology leader driving digital transformation across federal agencies. We empower forward-thinking innovations, accelerate production-ready software, and deliver secure solutions so federal agencies can meet their mission goals. Voted a Top Workplace, both regionally (Washington, DC) and nationally (USA), the past two years (2023 and 2024) based on feedback from our employees, we are headquartered in Fairfax, VA, and have a growing national footprint. We value and promote our Flexible Workplace approach because of the positive impacts it has on work-life integration. We remain committed to ensuring every employee's voice is heard, performance and results are recognized and rewarded, development and advancement is a focus, and diversity, equity and inclusion is a company priority. We offer competitive compensation and benefits (including a recently launched Employee Stock Ownership Plan - ESOP), a robust performance-based rewards program, and we know how to have fun! Our people and culture have endured and delivered for our clients for nearly three decades.
EEO Statement
Pyramid Systems, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
This is a full-time position that requires onsite presence in Des Moines, Iowa. Candidates must be authorized to work in the United States without sponsorship now or in the future.
P3+Uplift is partnering with a local insurance company to find a SQL-driven Data Analyst who enjoys working directly with business stakeholders to turn data questions into clear insights and reporting. This role is highly hands-on with SQL and data extraction, working across multiple data sources to support reporting, analysis, and data-driven decision making. The ideal candidate is both analytical and consultative—able to understand business needs, write efficient queries, and deliver clear, actionable insights.
The company offers a flexible schedule, hybrid work environment, casual dress code, and a collaborative culture, plus a comprehensive benefits package.
Key Responsibilities
- Write and optimize SQL queries to pull and analyze data from multiple sources.
- Partner with business teams to clarify questions, define metrics, and deliver actionable insights (a brief sketch follows this list).
- Build and maintain interactive reports and dashboards to support decision-making (Power BI preferred).
- Ensure data accuracy through validation, cleansing, and reconciliation.
- Document data sources, definitions, and analysis logic to create repeatable, reliable reporting processes.
- Identify opportunities to streamline data workflows, improve automation, and enhance reporting efficiency.
- Communicate findings and trends in clear, business-friendly language to stakeholders.
- Contribute to ad-hoc analysis projects, providing insights to guide business strategy.
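For illustration of the metrics work flagged above (the role itself is SQL-first; Python is used here as a stand-in): a sketch that turns raw rows into a business-friendly KPI table. The claims data and column names are hypothetical.

```python
# Turn raw claim rows into a monthly KPI summary by line of business.
# The data below is fabricated purely for illustration.
import pandas as pd

claims = pd.DataFrame({
    "claim_date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-11"]),
    "line_of_business": ["auto", "home", "auto"],
    "paid_amount": [1200.0, 4500.0, 800.0],
})

kpis = (
    claims
    .assign(month=claims["claim_date"].dt.to_period("M").astype(str))
    .groupby(["month", "line_of_business"])
    .agg(claim_count=("paid_amount", "size"), total_paid=("paid_amount", "sum"))
    .reset_index()
)
print(kpis)  # the kind of table a Power BI dashboard would sit on top of
```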
5+ years of experience:
- Strong SQL experience required with the ability to query and analyze large datasets.
- Experience working with data structures, relational databases, and multiple data sources.
- Experience with data validation, cleansing, and quality assurance.
- Experience with Power BI or other data visualization tools preferred.
- Ability to translate complex data into clear, business-friendly insights.
- Strong communication skills and a consultative approach with stakeholders.
Education: Bachelor’s degree in Business, Analytics, Statistics, or a related field, or equivalent experience
Candidates are required to have experience modernizing legacy Microsoft BI environments (including SSIS).
This is not an SSIS-only role.
The consultant will design, modernize, and enhance enterprise data and analytics solutions supporting Cyber Security, Physical Security, Electronic Security and Police operations.
This role includes evolving legacy SQL Server/SSIS-based processes into modern Azure data architectures while designing scalable new ETL/ELT pipelines and delivering executive-level analytics solutions.
The consultant will work directly with stakeholders to deliver production-grade reporting and analytics capabilities across multiple enterprise systems.
This requires architectural thinking and hands-on technical execution.
Core Responsibilities: Candidates must have direct experience building enterprise-grade ETL pipelines and executive Power BI dashboards.
- Design and implement modern ETL/ELT pipelines in Azure
- Assess and refactor existing SSIS packages as part of broader modernization efforts
- Architect Lakehouse / Medallion data models (a minimal sketch follows this posting's skills lists)
- Develop optimized dimensional data models (star schema)
- Integrate data from SQL Server, Oracle, APIs, and security platforms
- Design and deploy enterprise Power BI dashboards
- Build paginated reports using Power BI Report Builder
- Optimize DAX and dataset performance
- Implement Row-Level Security (RLS)
- Support CI/CD and DevOps deployment processes
- Produce technical documentation and data lineage artifacts
- Engage directly with executive stakeholders
Required Technical Skills (Must-Have)
Data Engineering & Architecture:
- Strong ETL/ELT design and optimization experience
- Advanced SQL (expert-level required)
- Python / PySpark
- Dimensional data modeling (star schema required)
- REST API integrations
Azure Data Stack:
- Azure Data Factory
- Azure Databricks
- Azure Synapse Analytics
- Azure Data Lake Storage
Microsoft Data Platform:
- Experience with SQL Server data warehouse environments
- Working knowledge of SSIS and experience modernizing or migrating SSIS workflows to Azure-based solutions
Power BI:
- Power BI Desktop (expert-level)
- Advanced DAX
- Executive dashboard development
- Paginated reports (Power BI Report Builder)
- Data Gateway configuration
- Incremental refresh
- Row-Level Security (RLS)
Nice to Have:
- Microsoft Purview
- Terraform (Infrastructure-as-Code)
- Orchestration tools (Airflow or equivalent)
- Security systems data integration experience
- Experience with C# / .NET web application development (for integration with internal systems or APIs)
Experience Requirements:
- 7+ years of hands-on data engineering / analytics delivery
- Demonstrated experience building production data pipelines in Azure
- Proven experience delivering executive-facing Power BI solutions
- Experience working in complex enterprise environments
Software Skills: 4–6 years of experience in Azure for building, deploying, and managing cloud-based data and application services.
Technical Skills: 2–4 years of experience in .NET code development for developing and maintaining enterprise applications and data processing components.
6+ years of experience in Data Modeling including designing logical and physical data models for enterprise data warehouses and analytics systems.
6+ years of experience in Python scripting for data processing, automation, ETL development, and data transformation tasks.
6+ years of experience in Structured Query Language (SQL) for writing complex queries, stored procedures, performance tuning, and data manipulation.
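To illustrate the Lakehouse/Medallion and star-schema items referenced above: a minimal PySpark sketch of a bronze-to-gold flow ending in a small star schema. The paths, columns, and table names are placeholders, not the client's actual model.

```python
# Medallion-style flow sketch: raw (bronze) -> cleansed (silver) -> star schema (gold).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion_sketch").getOrCreate()

# Bronze: raw ingested events, loaded as-is.
bronze = spark.read.json("/lake/bronze/security_events/")

# Silver: deduplicated, conformed records with a proper date column.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
    .filter(F.col("event_id").isNotNull())
)

# Gold: a small star schema - one conformed dimension plus a fact table.
dim_site = silver.select("site_id", "site_name").dropDuplicates(["site_id"])
fact_events = silver.groupBy("event_date", "site_id").agg(
    F.count("*").alias("event_count")
)

dim_site.write.mode("overwrite").parquet("/lake/gold/dim_site/")
fact_events.write.mode("overwrite").parquet("/lake/gold/fact_events/")
```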
Electrical Engineer - Data Centers - San Francisco
Metric DCX are partnered with a global engineering and consultancy firm to support the continued growth of their data center division.
This Electrical Engineer position will specialize in data center facility design to be embedded directly with a major end-user client.
Responsibilities:
- Assessing third-party and colocation facilities being considered for acquisition, evaluating their suitability against the client's portfolio requirements.
- Taking ownership of power systems across all project phases, identifying and resolving issues as they arise in collaboration with the relevant client stakeholders.
- Reviewing data center designs with a critical eye on redundancy architecture, availability targets, and potential single points of failure.
- Working closely with operations, planning, and energy strategy teams to push electrical solutions forward on third-party data center projects.
- Conducting technical due diligence and maintaining quality standards in line with client expectations.
- Keeping internal documentation, specs, and standards current based on live project feedback and lessons learned.
- Liaising with internal teams on power loading, rack deployment, and load balancing within shared facilities.
- Contributing to cross-discipline coordination with mechanical and controls engineers, and supporting consistency across regional teams.
Background Required
- Degree-qualified in Electrical Engineering; a postgraduate qualification or PE license would be a strong advantage.
- At least five years working within mission-critical environments, with solid hands-on exposure to colocation and multi-tenant data center projects specifically.
- Confident in power systems analysis and the software tools that come with it.
- Practical experience across the full electrical distribution stack — from high voltage transformers down to branch circuits — covering design, procurement, commissioning, and operations.
- Comfortable working across disciplines and engaging with structural, mechanical, civil, and IT/Telecom teams as needed.
- Grounded in US electrical codes and standards, with some awareness of IEC standards beneficial.
Overview
We are seeking a seasoned Analytics leader to build and lead our enterprise Analytics and Data Governance function in a modern group purchasing / procurement environment. This leader will turn our rich ecosystem of member, supplier, contract, and transaction data into a strategic asset that drives savings, compliance, growth, and differentiated insight for our members and suppliers.
This leader will also own the data governance operating model, enterprise metrics, and analytics roadmap that power member-facing insights, internal performance management, and AI use cases across the technology platform (Website, B2B eCommerce, supplier portal, sourcing tools, and partner integrations).
Key responsibilities
Data governance and policy
- Define and run the enterprise data governance framework covering member, supplier, contract, item, and transaction data domains.
- Establish data ownership and stewardship across functions (Category Management, Supplier Management, Finance, Sales, Marketing, Digital) driving clear accountabilities for data quality and definitions.
- Implement policies for responsible use of data in supplier programs, member reporting, and AI/ML models, ensuring compliance with contractual, regulatory, and privacy requirements.
- Drive data quality management (profiling, remediation, SLAs) for critical assets such as contract price files, item catalogs, rebate/accrual data, and member hierarchies.
- Oversee metadata, business glossary, and data lineage so teams can confidently understand "one source of truth" for core GPO metrics (e.g., committed vs. actual spend, penetration, compliance, savings delivered).
Analytics strategy and delivery
- Define the enterprise analytics vision and roadmap aligned to procurement value levers: spend visibility, category performance, contract compliance, leakage detection, rebate optimization, and supplier performance.
- Lead the design and delivery of standardized KPI suites and dashboards for executives, category teams, supplier partners, and member account teams (e.g., savings scorecards, compliance heatmaps, portfolio optimization).
- Partner with Product and Engineering to ensure the data platform (warehouse, semantic layer, BI tools) can support self-service analytics, embedded insights in member/supplier portals, and AI-driven use cases.
- Champion enterprise metrics and advanced analytics capabilities such as forecasting, benchmarking, opportunity sizing, and integrity analytics, ensuring models are traceable, governed, and auditable.
- Translate business needs into clear data products (curated data sets, subject-area marts, APIs) that serve both internal teams and external-facing solutions.
Stakeholder leadership and collaboration
- Serve as the enterprise "single point of accountability" for data and analytics, aligning priorities across Technology, Category Management, Supplier Relations, Sales, Finance, and Operations.
- Partner with Supplier and Member-facing teams to co-create analytics offerings that differentiate the GPO (e.g., supplier growth playbooks, member CFO dashboards, public-sector transparency packs).
- Educate executives and business leaders on data literacy, standard metrics, and how to use insights in planning, negotiations, and supplier programs.
- Collaborate closely with Security, Legal, and Compliance to ensure that member and supplier data is used ethically and in line with contracts and regulations.
Team building and operations
- Build and lead a high-performing team of data analysts, analytics engineers, data governance managers, and data stewards.
- Define operating rhythms (data council, data domain forums, metric review cadences) that keep governance and analytics tightly connected to business outcomes.
- Establish and track KPIs for the data function itself (data quality scores, adoption of governed datasets, BI usage, time-to-insight).
- Select and manage key tools and vendors in the analytics and governance ecosystem (warehouse, BI, catalog/governance, quality monitoring).
Qualifications
- Bachelor's or Master's degree in Data/Computer Science, Information Systems, Analytics, Statistics, Business, or related field.
- 10+ years of experience in analytics, data governance, or enterprise data management, including 3–5+ years leading teams.
- Proven experience in a procurement, supply chain, GPO, distribution, or B2B marketplace environment strongly preferred.
- Demonstrated success implementing data governance frameworks and delivering analytics that directly influenced commercial or procurement outcomes (e.g., savings, compliance, supplier growth).
- Hands-on familiarity with modern data platforms (e.g., Snowflake/BigQuery/Redshift, dbt, Power BI/Tableau/Looker, and one or more data catalog/governance tools).
- Strong grasp of regulatory / contractual considerations relevant to member and supplier data (data sharing agreements, use of benchmarking, privacy/security standards).
- Excellent leadership, storytelling, and stakeholder management skills; able to influence at C-suite and board levels.
Attributes for success
- Business-first mindset: instinctively ties data work to member value, supplier value, and financial impact.
- Pragmatic operator: balances governance rigor with speed, enabling innovation rather than blocking it.
- Skilled translator: can convert complex data and AI topics into clear narratives for executives, sales, and category leaders.
- Culture builder: passionate about creating a data-driven culture that values standard definitions, trusted data, and measurable outcomes.
Compensation:
$150,000 to $200,000 annual salary.
Exact compensation may vary based on several factors, including skills, experience, and education.
Benefit packages for this role may include healthcare insurance offerings and paid leave as provided by applicable law.