Data Structure & Data Engineering Jobs in the USA

24,743 positions found

Data Engineer
✦ New
🏢 Theoris
Salary not disclosed
Indianapolis, IN 10 hours ago

Job Title: Data/Software Engineer


Location: Remote


Industry: Pharmaceutical


***NO C2C***


Job Description:

Theoris Services is assisting our client in their search for a Data/Software Engineer to add to their growing team. Our client is seeking someone with data visualization experience and software engineering skills (creating reusable libraries, applying best practices, troubleshooting).


Responsibilities:

Data Pipeline & Backend Development

  • Design, build, and optimize scalable data pipelines and ETL/ELT processes to integrate and harmonize scientific data (compounds, assays, experiments) from 30+ heterogeneous sources.
  • Implement and maintain lakehouse architectures on AWS (S3, Glue, Athena, Iceberg) to support multibillion-record datasets.
  • Develop federated query capabilities using Trino (or similar distributed engines) for unified access across platforms like PostgreSQL, Snowflake, and others.
  • Build robust backend services, RESTful APIs, and data services using Python (FastAPI, Flask preferred) to enable seamless data flow and integration with scientific tools (e.g., Benchling, computational chemistry systems, AI/ML endpoints).

Performance Optimization & Troubleshooting

  • Optimize query and database performance for complex analytical workloads across PostgreSQL, Iceberg, Trino, and other platforms.
  • Implement caching, indexing, and query tuning techniques to improve response times and scalability as data volumes and user bases grow.
  • Apply reverse engineering and advanced troubleshooting skills to debug complex data issues, pipeline bottlenecks, application failures, and performance problems proactively.
  • Monitor systems, identify root causes, and implement fixes for data and application reliability.

Data Visualization & User-Facing Analytics

  • Design and develop interactive dashboards, visual analytics, and scientific data visualizations using Power BI and Spotfire (or equivalent tools).
  • Create reusable visualization components and data-rich UIs (React/TypeScript preferred) to enable scientists to search, filter, explore, and interpret complex datasets—including dose-response curves, chemical structures, and analytical results.
  • Translate scientific and engineering data into clear, actionable visual insights for researchers and stakeholders.

Software Engineering & Quality Practices

  • Apply best software engineering practices: modular/reusable design, clean code principles, code reviews, comprehensive documentation, and creation of maintainable libraries/services.
  • Write high-quality unit, integration, and end-to-end tests; use mock data effectively to create reliable automated test cases and ensure code stability.
  • Implement CI/CD pipelines for automated testing, deployment, and monitoring on AWS (EC2, ECS, Lambda, S3).
  • Collaborate on full-stack features from database to frontend, ensuring end-to-end functionality, security (SSO/LDAP), and performance.

Collaboration & Governance

  • Partner with scientists, UX designers, and cross-functional teams to gather requirements, conduct user testing, and iterate on usability.
  • Implement data validation, quality checks, metadata management, and governance to ensure compliance and accuracy.
  • Contribute to engineering best practices and foster a culture of quality and scalability.
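The caching duty listed under Performance Optimization can be sketched in miniature. This is an illustrative toy, not the client's implementation: the function name, the compound ID, and the `CALLS` counter are invented for the demo, and `functools.lru_cache` stands in for whatever query-result cache a real service would use.

```python
# Toy sketch of result caching to cut response times for repeated queries.
from functools import lru_cache

CALLS = {"count": 0}  # track how often the "database" is actually hit

@lru_cache(maxsize=256)
def compound_assay_results(compound_id: str) -> tuple:
    """Stand-in for an expensive federated query (e.g., a Trino round trip)."""
    CALLS["count"] += 1
    return (compound_id, "assay-results-placeholder")

# First call misses the cache; the second is served from memory.
compound_assay_results("CMPD-001")
compound_assay_results("CMPD-001")
print(CALLS["count"])  # the backing query ran only once
```

In a real backend the cache key would also encode query parameters, and entries would need invalidation when upstream data changes; `lru_cache` only shows the shape of the idea.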


Requirements:

Education & Experience

  • Bachelor's degree in Computer Science, Data Engineering, Software Engineering, Information Systems, or a related technical field.
  • 3+ years of professional experience in data engineering, full-stack development, or closely related roles.
  • Proven track record of building and delivering production-grade data pipelines, platforms, and/or user-facing scientific applications.

Technical Skills

  • Programming: Intermediate to strong proficiency in Python (core for pipelines, backend, and data manipulation with pandas/PySpark); familiarity with JavaScript/TypeScript for frontend.
  • Data Engineering: Hands-on experience creating scalable pipelines, ETL/ELT processes, and distributed processing (Spark, Trino/Presto).
  • Databases & Querying: Deep expertise in relational databases (PostgreSQL), modern warehouses (Snowflake, Redshift), and query engines; strong focus on query performance improvement and optimization.
  • Cloud Platforms: Practical experience with AWS services (S3, Glue, Athena, Lambda, RDS, EC2/ECS).
  • Data Visualization: Proven experience with Power BI and Spotfire (or similar) for scientific and analytical dashboards/visualizations.
  • Frontend (preferred): Modern JavaScript/TypeScript frameworks (React preferred), responsive UI development, and component libraries.
  • Testing & Quality: Strong unit testing skills; experience writing automated tests with mock data for robust coverage.
  • Tools & Practices: Git for version control; API design (RESTful); CI/CD; clean code and reusable library development.

Core Competencies

  • Excellent reverse engineering and troubleshooting capabilities for complex data and system issues.
  • Strong problem-solving skills with attention to detail and commitment to data quality/accuracy.
  • Ability to work independently and collaboratively in cross-functional, scientific teams.
  • Excellent communication skills to bridge technical concepts with non-technical stakeholders (scientists, researchers).


Best-In-Class Benefits:

We are in the people business; treating people right is our ONLY priority. Theoris Services consultants are full-time employees with full benefits, including:

  • Robust Health Insurance
  • 401(k) plan


About Theoris:

Our goal is to Fuel Your Career! As a Theoris team member, you join a culture based on people-centered values and an environment that fosters both personal and professional growth. We build long-term relationships with our clients and our consultants. With over 30 years of building strong relationships in the industry, we’re uniquely positioned to make the right connections. This knowledge is used to find the right job placement. Our recruiting teams are experts dedicated to the information technology and engineering staffing space and are highly respected by our client base.

Not Specified
Locum Physician (MD/DO) - Psychiatry - General/Other in Indianapolis, IN
Salary not disclosed
Indianapolis, IN 2 days ago


Doctor of Medicine | Psychiatry - General/Other

Location: Indianapolis, IN

Employer:

Pay: Competitive weekly pay (inquire for details)

Start Date: ASAP


About the Position

LocumJobsOnline is working with to find a qualified Psychiatry MD in Indianapolis, Indiana, 46201!

This Job at a Glance

  • Job Reference Id:  ORD-210224-MD-IN
  • Title:  MD
  • Dates Needed:  April 3rd - July 3rd
  • Shift Type:  Day Shift
  • Assignment Type:  Inpatient
  • Call Required:  No
  • Board Certification Required:  No
  • Job Duration:  Locums

About the Facility

This inpatient psychiatric hospital specializes in advanced diagnosis and stabilization of complex mental health cases. The facility provides comprehensive psychiatric services with dedicated resources for intensive treatment of patients requiring specialized inpatient care. The hospital maintains modern documentation systems and interdisciplinary treatment teams to ensure effective delivery of psychiatric services in a structured inpatient environment.

About the Facility Location

Indianapolis features notable attractions including the Indianapolis Motor Speedway Museum and the Children's Museum, along with various specialty museums and walking tour opportunities. The region offers diverse entertainment venues such as the Ruoff Music Center, Everwise Amphitheatre, Clowes Memorial Hall, and Old National Centre, providing options for arts, live music, sports, and shopping experiences. Downtown presents dining and beverage establishments alongside cultural and recreational activities suitable for various interests.

About the Clinician's Workday

The psychiatrist will direct psychiatric services in accordance with institutional policies, utilizing advanced clinical training and professional judgment to guide diagnostic evaluations, treatment plans, and patient care decisions. This position serves as the final authority on psychiatric evaluations while overseeing program planning and reviewing clinical recommendations. The clinician will work part-time for 24 hours per week during day shifts, conducting weekly rounds on a 24-bed inpatient unit and managing complex mental health cases requiring advanced diagnosis and stabilization. Responsibilities include contributing to departmental strategy and agency-wide policy through research and program evaluation while maintaining comprehensive documentation of patient progress and treatment interventions.


Additional Job Details
  • Case Load/PPD:  24 beds / rounding 1x a week
  • Support Staff:  Nursing staff, medical assistants, and administrative support
  • Patient Population:  Adults
  • Location Type:  On-Site
  • Government:  No
  • Shift Hours:  Part time (24 hours)
  • Cases Treated:  Complex mental health cases requiring advanced diagnosis and stabilization
  • Average Length of Stay:  Variable based on patient complexity and treatment needs
  • Census:  24 beds / rounding 1x a week
  • Med Checks/Follow-up per day:  Variable b

    Contact:

About

The need has never been greater to connect great clinicians and great healthcare facilities. That’s what we do. Every day. We’re . We connect clients and clinicians to take care of patients. How do we do it? By doing it better than everyone else. Whether you’re looking for a locum tenens job or locum tenens coverage, our experienced agents have the specialized knowledge, know-how, and personal relationships to take care of you and your search.  


provides comprehensive onboarding and optional 1099 financial consulting from a partner advisor.


 


We cover your malpractice insurance (A++) and provide assistance with credentialing, privileging, licensing, housing and travel.


 


Our agents have the specialized knowledge and personal connections to provide the best locum tenens experience and negotiate top pay on your behalf.



permanent
Data Reporting Analyst
🏢 Deploy
Salary not disclosed
Birmingham, AL 3 days ago

DEPLOY has been retained to find a Reporting & Data Architect Lead, a role that combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform—designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.

You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.

At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.

Key Responsibilities

Enterprise Reporting (Hands-On Development)

  • Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
  • Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
  • Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
  • Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
  • Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (Responsible, with the Director of Technology Accountable).
  • Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
  • Support reporting scalability as our client grows (new factories, new business units, new product lines).

Enterprise Reporting Standards & Governance

  • Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
  • Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
  • Review and approve reporting change requests, data model modifications, and access requests.
  • Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
  • Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.

Enterprise Data Architecture

  • Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
  • Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
  • Build and govern the centralized semantic model that powers reporting across the company.
  • Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
  • Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.

Leadership & Collaboration

  • Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
  • Communicate complex architectural concepts in clear, business-friendly terms.
  • Lead cross-functional initiatives that require unified data structures or scalable reporting.
  • Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.

Ideal Candidate Profile

  • Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
  • Track record of establishing and enforcing enterprise reporting standards and governance.
  • Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
  • Able to operate as both an individual contributor and a strategic leader.
  • Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
  • Comfortable influencing architectural decisions and guiding technical execution.
  • Strong command of foundational tools and languages such as:
      • DAX
      • Power Query / M
      • SQL
      • Fabric pipelines / ETL tooling
  • Experience with automation and AI-assisted analytics workflows.
Not Specified
Data Quality Analyst / Data Steward
Salary not disclosed
Montgomery 2 days ago
Job Requisition: Data Quality Analyst / Data Steward
Contract Length: Long Term – potential renewal each fiscal year
Work Location: 100% onsite – Montgomery, AL

Candidate Profile

Experienced data professional capable of building, advancing, and scaling data quality and governance foundations from scratch. Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.

Role Purpose

Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment. Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture. Drive long-term improvement of data standards, definitions, lineage, and quality processes.

Key Responsibilities

Data Quality & Engineering

  • Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
  • Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
  • Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
  • Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
  • Conduct root cause analysis and implement preventive and long-term remediation solutions.
  • Optimize SQL queries, tune stored procedures, and improve data processing performance.
  • Document audit findings, validation processes, data flows, standards, and quality reports.
  • Build dashboards and reports for data quality KPIs using Power BI/Tableau.
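As a rough illustration of what validation rules and quality scoring can look like in practice (the field names, rules, and sample rows below are hypothetical, not the client's; a production version would live in T-SQL/SSIS rather than Python):

```python
# Toy data-quality checks: completeness, a validity rule, and an overall score.
def check_completeness(record: dict, required: list) -> bool:
    """Every required field must be present and non-empty."""
    return all(record.get(f) not in (None, "") for f in required)

def check_validity(record: dict) -> bool:
    """Example rule: 'amount' must be a non-negative number."""
    amt = record.get("amount")
    return isinstance(amt, (int, float)) and amt >= 0

def quality_score(records: list, required: list) -> float:
    """Fraction of records passing every rule (1.0 for an empty set)."""
    passed = sum(
        1 for r in records
        if check_completeness(r, required) and check_validity(r)
    )
    return passed / len(records) if records else 1.0

rows = [
    {"id": 1, "amount": 120.0},
    {"id": 2, "amount": -5},    # fails the validity rule
    {"id": 3, "amount": None},  # fails the completeness rule
]
print(quality_score(rows, required=["id", "amount"]))  # one of three rows passes
```

The score itself is what would feed the Power BI/Tableau KPI dashboards mentioned above, typically tracked per dataset and per rule over time.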

Data Stewardship & Governance

  • Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
  • Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
  • Ensure proper and consistent data usage across departments and systems.
  • Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
  • Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
  • Provide training on data entry, data handling, stewardship practices, and data literacy.
  • Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.

Greenfield / Low-Maturity Environment

  • Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
  • Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
  • Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
  • Drive adoption of data quality and governance practices across business and technical teams.
  • Support long-term evolution of enterprise data strategy and governance maturity.

Required Technical Skills

  • Advanced T-SQL, SQL Server development, debugging, and performance tuning.
  • SSIS development, deployment, and troubleshooting.
  • Data profiling, validation rule design, quality scoring, and measurement techniques.
  • ETL/ELT pipeline design, debugging, and optimization.
  • Data modeling (conceptual, logical, physical).
  • Metadata management and lineage documentation.
  • Reporting and dashboarding with Power BI, Tableau, or similar tools.
  • Strong documentation and communication skills.

Preferred Skills

  • Knowledge of DAMA-DMBOK, DCAM, MDM concepts, and governance frameworks.
  • Experience in low-maturity/greenfield data environments.
  • Familiarity with AI/ML data readiness and feature-store-aligned data structuring.
  • Cloud data engineering exposure (Azure, Databricks, GCP).

Education

  • Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.
  • Master’s degree preferred.

Certifications (Preferred)

  • DAMA CDMP (Associate/Practitioner)
  • EDM Council DCAM
  • ASQ Data Quality Credential
  • Collibra Data Steward Certification
  • Certified Data Steward (eLearningCurve)
  • Cloud/AI certifications (Azure, Databricks, Google)
Not Specified
Associate Partner, Data and Technology Transformation
$250 +
Chicago, IL 2 days ago
Introduction
Your role and responsibilities
About the Opportunity

IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.


You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.


This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.


Key Responsibilities
Market Leadership & Growth

  • Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.


  • Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.


  • Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.



Strategic Advisory & Transformation Delivery

  • Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.


  • Architect integrated solutions that include:
      • Migration from legacy data platforms to modern cloud-based architectures
      • Data engineering and information governance
      • Business intelligence and advanced analytics
      • GenAI-powered and agentic AI-driven automation and decisioning


  • Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.



Engagement Excellence & Financial Stewardship

  • Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.


  • Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.


  • Align the right technologies, industry expertise, and global delivery capabilities to maximize client value.



Practice Building & Talent Development

  • Recruit, mentor, and grow top-tier consultants, architects, and data specialists.


  • Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.


  • Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.



Thought Leadership & Market Presence

  • Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.


  • Represent IBM at industry conferences, client events, webinars, and executive roundtables.


  • Create original thought leadership—articles, perspectives, point-of-views—that positions IBM as a leading advisor in data and AI-driven transformation.



This position can be performed anywhere in the US.


"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."


Required technical and professional expertise
Qualifications

  • 12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.


  • Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.


  • Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:
      • AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)
      • Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)
      • Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)


  • Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.


  • Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.


  • Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.


  • Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.


  • Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.


  • Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.


  • Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.



Preferred technical and professional experience

IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.


Not Specified
Data Integration & AI Engineer
Salary not disclosed
Edison, NJ 2 days ago

About Wakefern

Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.


Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.


The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.


Essential Functions

  • Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
  • Implement and enforce data quality and governance standards to ensure the accuracy and consistency.
  • Provide input for project plans and timelines to align with business objectives.
  • Monitor project progress, identify risks, and implement mitigation strategies.
  • Work with cross-functional teams and ensure effective communication and collaboration.
  • Provide regular updates to the management team.
  • Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the >tech_ structure.
  • Communicate and promote the code of ethics and business conduct.
  • Ensure completion of required company compliance training programs.
  • Be trained – either through formal education or through experience – in software/hardware technologies and development methodologies.
  • Stay current through personal development and professional and industry organizations.

Responsibilities

  • Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
  • Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
  • Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
  • Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
  • Ensure data solutions and data sources meet quality, security, and compliance standards.
  • Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
  • Provide technical training, documentation, and ongoing support to end users of data automation systems.
  • Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.


Qualifications

  • A bachelor's degree or higher in computer science, information systems, or a related field.
  • Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
  • Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
  • Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
  • Experience with workflow orchestration tools such as Cloud Composer or Airflow
  • Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
  • Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
  • Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
  • Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
  • Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
  • Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
  • Hands-on experience with IBM DataStage and Alteryx is a plus.
  • Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
  • Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
  • Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
  • Familiarity with data modeling tools.
  • Familiarity with DevOps practices for data (CI/CD pipelines)
  • Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
  • Strong knowledge and skills in data management, data quality, and data governance.
  • Strong communication, collaboration, and problem-solving skills.
  • Ability to work on multiple projects and prioritize tasks effectively.
  • Ability to work independently and in a team environment.
  • Ability to learn new technologies and tools quickly.
  • Ability to handle stressful situations.
  • Highly developed business acumen.
  • Strong critical thinking and decision-making skills.
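The RAG bullet above centers on indexing a knowledge base and retrieving by vector similarity. As a minimal sketch of that retrieval step in plain Python (the embeddings and document names here are invented toy values; in practice the vectors would come from an embedding model and live in a service such as Pinecone or Vertex AI Vector Search):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "knowledge base": doc_id -> embedding. Real embeddings are
# model-generated and stored/indexed in a vector database.
index = {
    "doc_assays":    [0.9, 0.1, 0.0],
    "doc_compounds": [0.1, 0.8, 0.2],
    "doc_pricing":   [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    # Rank documents by similarity to the query and return the top-k ids;
    # these would then be passed to the LLM as grounding context.
    ranked = sorted(index, key=lambda d: cosine(query_vec, index[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05]))
```

A production pipeline adds chunking, metadata filtering, and approximate nearest-neighbor search, but the ranking logic is the same idea.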


Working Conditions & Physical Demands

This position requires in-person office presence at least 4x a week.


Compensation and Benefits

The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.

Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.


Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.

Data Quality Analyst
✦ New
Salary not disclosed
Juno Beach, FL 10 hours ago

Job Description


The Data Quality Analyst / Databricks Implementation Specialist plays a key role in advancing the company’s enterprise data governance and Databricks Lakehouse strategy. This role partners closely with business data stewards, data owners, and technical teams to translate business data requirements into governed, high-quality datasets within Databricks Unity Catalog. The analyst will support domain onboarding, develop and operationalize data quality rules, perform profiling and analysis, and help implement enterprise standards for metadata, lineage, and semantic consistency.


Key Responsibilities


  • Data Quality & Profiling
  • Develop, document, and maintain data quality rules for critical data elements (CDEs).
  • Perform data profiling, anomaly detection, and root-cause analysis.
  • Partner with data stewards to validate definitions, thresholds, and business rules.
  • Monitor and report on data quality metrics and remediation progress.
  • Databricks Unity Catalog Implementation
  • Support Unity Catalog rollout across domains, including catalog structure, tagging, and metadata standards.
  • Assist with onboarding domains into the Bronze → Silver → Gold architecture.
  • Ensure lineage, ownership, and quality rules are embedded into Databricks pipelines.
  • Help implement domain-aligned access controls and sensitivity tagging.
  • Collaboration with Data Stewards & Business Partners
  • Work directly with business data stewards to understand data requirements and quality expectations.
  • Translate business meaning into standardized CDEs and steward-approved metadata.
  • Facilitate working sessions to align on semantics, domain boundaries, and data product requirements.
  • Support consistent governance practices across domains.
  • Metadata, Lineage, and Catalog Management
  • Maintain high-quality metadata in the enterprise data catalog.
  • Ensure CDEs, KPIs, and domain terms are accurately documented.
  • Validate lineage from raw sources through refined layers.
  • Data Analysis & Issue Resolution
  • Investigate data issues raised by business users or downstream consumers.
  • Perform impact analysis for schema changes or quality rule updates.
  • Support remediation efforts with engineering and business teams.
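The "develop and operationalize data quality rules" items above amount to evaluating predicates over critical data elements and comparing pass rates against thresholds. A rough sketch under invented rule names, fields, and thresholds (in Databricks this logic would typically run as pipeline expectations or a scheduled quality job rather than in-memory Python):

```python
# Hypothetical CDE quality rules: (name, predicate, minimum pass rate).
rules = [
    ("meter_id_not_null", lambda r: r.get("meter_id") is not None, 1.00),
    ("kwh_in_range",      lambda r: r.get("kwh") is not None and 0 <= r["kwh"] <= 100_000, 0.95),
]

records = [
    {"meter_id": "M-001", "kwh": 420.5},
    {"meter_id": "M-002", "kwh": -3.0},   # out of range
    {"meter_id": None,    "kwh": 12.0},   # null CDE
]

def evaluate(rules, records):
    # Return {rule_name: (pass_rate, met_threshold)} for quality reporting.
    results = {}
    for name, pred, threshold in rules:
        passed = sum(1 for r in records if pred(r))
        rate = passed / len(records)
        results[name] = (round(rate, 2), rate >= threshold)
    return results

print(evaluate(rules, records))
```

Rules that fall below threshold feed the remediation and root-cause work described above.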


Required Skills & Experience


3–5 years of experience in data quality, data governance, or data analysis.

Hands-on experience with Databricks, Delta Lake, or similar cloud platforms.

Strong understanding of data quality concepts.

Experience with metadata catalogs or governance tools.

Proficiency with SQL and data analysis.

Strong communication skills.


Nice to Have Skills & Experience


Experience with Databricks Unity Catalog.

Familiarity with Medallion Architecture.

Exposure to governance frameworks (DAMA, DCAM).

Experience collaborating with data stewards or data owners.

Knowledge of data modeling or semantic layers.


Pay rate ranges from $35-43/hr, depending on background and experience.

Data Analyst
✦ New
Salary not disclosed
Des Moines, IA 1 day ago

This is a full-time position that requires onsite presence in Des Moines, Iowa. Candidates must be authorized to work in the United States without sponsorship now or in the future.


P3+Uplift is partnering with a local insurance company to find a SQL-driven Data Analyst who enjoys working directly with business stakeholders to turn data questions into clear insights and reporting. This role is highly hands-on with SQL and data extraction, working across multiple data sources to support reporting, analysis, and data-driven decision making. The ideal candidate is both analytical and consultative—able to understand business needs, write efficient queries, and deliver clear, actionable insights.


The company offers a flexible schedule, hybrid work environment, casual dress code, and a collaborative culture, plus a comprehensive benefits package.


Key Responsibilities

  • Write and optimize SQL queries to pull and analyze data from multiple sources.
  • Partner with business teams to clarify questions, define metrics, and deliver actionable insights.
  • Build and maintain interactive reports and dashboards to support decision-making (Power BI preferred).
  • Ensure data accuracy through validation, cleansing, and reconciliation.
  • Document data sources, definitions, and analysis logic to create repeatable, reliable reporting processes.
  • Identify opportunities to streamline data workflows, improve automation, and enhance reporting efficiency.
  • Communicate findings and trends in clear, business-friendly language to stakeholders.
  • Contribute to ad-hoc analysis projects, providing insights to guide business strategy.
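The validation and reconciliation work described above often boils down to a SQL comparison between source data and a reporting layer. A self-contained sketch using SQLite as a stand-in (table and column names are invented, not the client's schema):

```python
import sqlite3

# In-memory demo: reported totals should reconcile with source transactions.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE transactions (policy_id TEXT, premium REAL);
    CREATE TABLE report_totals (policy_id TEXT, premium_total REAL);
    INSERT INTO transactions VALUES ('P1', 100.0), ('P1', 50.0), ('P2', 200.0);
    INSERT INTO report_totals VALUES ('P1', 150.0), ('P2', 180.0);  -- P2 is off
""")

# Policies whose reported total disagrees with the summed source data.
mismatches = con.execute("""
    SELECT t.policy_id, SUM(t.premium) AS source_total, r.premium_total
    FROM transactions t
    JOIN report_totals r ON r.policy_id = t.policy_id
    GROUP BY t.policy_id, r.premium_total
    HAVING SUM(t.premium) <> r.premium_total
""").fetchall()

print(mismatches)  # each row is a discrepancy to investigate
```

Here P1 reconciles (150.0 both sides) while P2 surfaces as a discrepancy, which is exactly the kind of finding that gets communicated back to stakeholders.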


5+ years of experience:

  • Strong SQL experience required with the ability to query and analyze large datasets.
  • Experience working with data structures, relational databases, and multiple data sources.
  • Experience with data validation, cleansing, and quality assurance.
  • Experience with Power BI or other data visualization tools preferred.
  • Ability to translate complex data into clear, business-friendly insights.
  • Strong communication skills and a consultative approach with stakeholders.


Education: Bachelor’s degree in Business, Analytics, Statistics, or a related field, or equivalent experience

Data Analyst Manager
✦ New
Salary not disclosed
Hickory, NC 1 day ago

Who We Are

At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.

What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.

Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?


Role Summary:

The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.


Responsibilities:

Data Architecture & Tooling

  • Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
  • Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
  • Ensure alignment between data infrastructure and evolving business needs across distribution operations
  • Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite

Data Governance & Quality

  • Define and enforce data standards, metric definitions, and naming conventions across all business domains
  • Establish data ownership, lineage documentation, and access governance policies
  • Implement monitoring and alerting for data quality issues across source systems and the warehouse
  • Build and maintain a data dictionary that serves as the single source of truth for the organization

Orchestration of Analysts & Systems

  • Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
  • Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
  • Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
  • Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business


Qualifications:

Required

  • 3-5 years of experience in data analytics, business intelligence, or data engineering
  • 2+ years in a lead or management role overseeing analysts or data team members
  • Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
  • Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
  • Proven track record establishing data governance standards and documentation practices
  • Experience integrating and managing multiple data sources across SaaS and ERP platforms
  • Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders

Preferred

  • Experience in distribution, wholesale, or supply chain environments
  • Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
  • Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
  • Background working within IT-led or hybrid IT/Analytics teams


Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Life & Disability insurance
  • 401(K) with company match


Company Paid holidays and PTO:

  • Feetures offers 20 PTO days, available on day one of employment to all employees regardless of role. After five years at Feetures, PTO increases to 25 days. Days can be used for vacations, appointments, and sick days.
  • We offer 10 company paid holidays and 1 floating holiday per year.


Perks:

  • Parking provided (Charlotte office and onsite at Hickory office)
  • Employee Engagement team
  • Monthly stipend to pursue an active lifestyle


Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.

Product Data Analyst
Salary not disclosed
Dallas, TX 2 days ago

Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.


Security Advisory: Beware of Frauds

Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.


We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.


Responsibilities

  • Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
  • Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
  • Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
  • API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
  • Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
  • Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.


What You Will NOT Do (The Boundaries)

  • No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
  • No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
  • No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.


Intersection with Technical Teams

  • With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
  • With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.


Experience, Skills, & Ability Requirements

  • 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
  • SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
  • API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
  • Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
  • You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
  • You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
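The SQL proficiency bullet above calls out CTEs and forensic investigation, and the responsibilities mention hunting "orphan" records. A self-contained sketch of that pattern using SQLite as a stand-in (schema invented for illustration): a CTE isolates listings whose SKU has no matching product row.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE products (sku TEXT PRIMARY KEY);
    CREATE TABLE listings (listing_id INTEGER, sku TEXT);
    INSERT INTO products VALUES ('RUG-1'), ('RUG-2');
    INSERT INTO listings VALUES (1, 'RUG-1'), (2, 'RUG-2'), (3, 'RUG-9');
""")

# CTE isolating listings with no parent product ("orphan" records).
orphans = con.execute("""
    WITH orphan_listings AS (
        SELECT l.listing_id, l.sku
        FROM listings l
        LEFT JOIN products p ON p.sku = l.sku
        WHERE p.sku IS NULL
    )
    SELECT * FROM orphan_listings
""").fetchall()

print(orphans)
```

The anti-join (LEFT JOIN plus IS NULL) generalizes to any parent/child consistency audit across schemas.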


Bonus Points (Nice-to-Haves)

  • Familiarity with Visio/Lucidchart to visualize data flows.
  • Ability to build simple dashboards in Tableau to track data health scores.
  • Basic familiarity with Python or R for data manipulation.


What We Offer

  • Health, dental, and vision benefits
  • Paid parental leave
  • 401(k) with employer match
  • A culture of meritocracy that fosters ongoing growth opportunities
  • A stable, growing family-owned company that looks after its employees


Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.

jobs by JobLookup