Join our Team!
Parkview Health is actively recruiting Board Eligible/Board Certified Neurologists to join our growing Neurosciences Department in Fort Wayne, Indiana.
Specifics of the Role of a Neurologist
Ideal candidates for our multidisciplinary practice will have an interest in general neurology or sub-specialty interests in epileptology, MS and neuroimmunology, vascular neurology, movement disorders, headache management, neuromuscular disease, neuro-oncology, behavioral neurology, or neurophysiology. Flexible inpatient and outpatient opportunities are available, as well as subspecialty practice development.
Parkview Physicians Group – Neurology Team
- Our diverse team includes 24 physicians in neurology, neurointensive care, neurointervention, and neurosurgery, and 16 advanced practice providers who specialize in the treatment of diseases of the central and peripheral nervous system.
- Neurology provides a consult-only service in the hospital, with separate neurocritical care admitting and consulting services for our 24-bed neurocritical care unit.
- Flexible light overnight call schedule
Parkview Health
- Second largest healthcare system in Indiana
- Proudly committed to bringing the highest quality of care to northeast Indiana and northwest Ohio
- Region’s largest employer with over 16,000 employees
- The health system comprises more than 1,100 world-class providers across more than 45 specialties in over 300 locations.
- Named one of the nation’s top employers by Forbes
- Parkview physicians and coworkers have earned the following certifications and accreditations:
- Comprehensive Stroke Center (seeing over 2,000 strokes annually)
- PRMC Stroke Center received "High Performing," the highest rating issued by U.S. News & World Report.
- Parkview Health 2021 Get With The Guidelines® - Stroke GOLD PLUS with Honor Roll Elite and Target: Type 2 Diabetes Honor Roll award
- Certified Level II Adult and Pediatric Trauma Center
- Accredited Chest Pain Center
Parkview Regional Medical Center
- $575 million tertiary medical center that opened its doors in 2012
- Beautiful, new 440-bed state-of-the-art facility
- Graduate Medical Education program
- Operates the Samaritan Flight Program and a hospital-based hospitalist and Neurointensivist/eICU® program
Graduate Medical Education Programs
- Flexible teaching opportunities to both residents and medical students
- Established family medicine program
- Parkview welcomed its first Internal Medicine and General Surgery ACGME-accredited residency classes in July 2022.
- Parkview is committed to continuing the growth of its residency specialty programs. The organization has invested significant resources in the development of approximately ten residency programs over the next five to seven years.
- These residencies will help fulfill the increasing need for more high-quality physicians in northeast Indiana, while also ensuring a positive and healthy place to work.
- The new residency programs will primarily be located at Parkview Hospital Randallia in Fort Wayne, a new state-of-the-art academic center, with rotations to Parkview Regional Medical Center and Parkview community hospitals.
- Parkview Health offers great opportunities to conduct research through our Parkview Mirro Center for Research and Innovation or teaching opportunities at the IU School of Medicine Fort Wayne campus.
Benefits
Our excellent benefit package includes:
- Highly competitive salaries plus annual incentive compensation opportunity
- Competitive PTO/CME time
- Commencement bonus
- Paid relocation
- Student loan assistance
- Retirement contribution plan
- Flexible spending accounts
- Medical, dental, vision & life insurance
- Short and long-term disability
- And many other non-traditional benefits!
DEPLOY has been retained to find a Reporting & Data Architect Lead, a role that combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform—designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.
You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.
At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.
Key Responsibilities
Enterprise Reporting (Hands-On Development)
- Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
- Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
- Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
- Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
- Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (you are Responsible, with the Director of Technology Accountable).
- Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
- Support reporting scalability as our client grows (new factories, new business units, new product lines).
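In Power BI, row-level security is defined as DAX filters attached to roles; conceptually, it is a per-user row predicate applied before any data is served. The sketch below illustrates only that concept in Python; the users, regions, and table are invented, not the client's actual setup.

```python
# Conceptual sketch of row-level security (RLS): each user sees only the
# rows their role permits. In Power BI this is a DAX filter on a role;
# here it is modeled as a plain predicate for illustration only.
# All names (users, regions, rows) are hypothetical.

SALES_ROWS = [
    {"region": "North", "amount": 100},
    {"region": "South", "amount": 250},
    {"region": "North", "amount": 75},
]

# User-to-region mapping, analogous to an RLS security table.
USER_REGIONS = {"alice": {"North"}, "bob": {"South"}}

def rows_visible_to(user: str) -> list[dict]:
    """Return only the rows the user's RLS rule allows."""
    allowed = USER_REGIONS.get(user, set())
    return [row for row in SALES_ROWS if row["region"] in allowed]

print(rows_visible_to("alice"))  # the two North rows
```

An unknown user maps to an empty set and sees no rows, which mirrors the deny-by-default posture expected of an enterprise reporting permission model.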
Enterprise Reporting Standards & Governance
- Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
- Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
- Review and approve reporting change requests, data model modifications, and access requests.
- Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
- Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.
Enterprise Data Architecture
- Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
- Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
- Build and govern the centralized semantic model that powers reporting across the company.
- Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
- Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.
Leadership & Collaboration
- Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
- Communicate complex architectural concepts in clear, business-friendly terms.
- Lead cross-functional initiatives that require unified data structures or scalable reporting.
- Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.
Ideal Candidate Profile
- Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
- Track record of establishing and enforcing enterprise reporting standards and governance.
- Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
- Able to operate as both an individual contributor and a strategic leader.
- Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
- Comfortable influencing architectural decisions and guiding technical execution.
- Strong command of foundational tools and languages such as:
- DAX
- Power Query / M
- SQL
- Fabric pipelines / ETL tooling
- Experience with automation and AI-assisted analytics workflows.
Materials Data Specialist
Radiant is seeking a Materials Data Specialist to support the development of our internal materials database. This role focuses on researching, validating, and organizing material property data used by engineering teams across reactor design, thermal systems, and structural analysis.
The ideal candidate has a technical background in materials science, materials engineering, or a related engineering discipline and enjoys working at the intersection of engineering research, data quality, and structured documentation.
You will evaluate the quality of material property sources, organize data into defined schemas, and contribute to documentation that helps engineers confidently use materials data in design and analysis.
Responsibilities
Research Materials Data
- Search scientific literature, databases, and reports to identify relevant material property data.
- Evaluate the quality, reliability, and applicability of material property sources.
- Flag inconsistencies or uncertainty in data sources.
Structure Engineering Data
- Enter material property data into internal databases following defined schemas and standards.
- Maintain consistent formatting and traceability of data sources.
- Ensure data integrity and reproducibility for engineering use.
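The schema-driven data entry described above can be sketched as a simple validation step run before a record enters the database. The schema, field names, and example material below are hypothetical illustrations, not Radiant's actual structures.

```python
# Minimal sketch: validate a material-property record against a defined
# schema before it enters the database, keeping source traceability.
# The schema and field names are hypothetical.

SCHEMA = {
    "material": str,      # e.g. an alloy designation
    "property": str,      # e.g. "yield_strength"
    "value": float,
    "units": str,         # units must always be recorded explicitly
    "source": str,        # citation string, for traceability
}

def validate(record: dict) -> list[str]:
    """Return a list of schema violations (empty list = valid)."""
    errors = []
    for field, expected in SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

good = {"material": "SS316L", "property": "yield_strength",
        "value": 205.0, "units": "MPa", "source": "handbook citation"}
bad = {"material": "SS316L", "value": "205"}  # wrong type, missing fields

print(validate(good))  # []
print(validate(bad))
```

Requiring an explicit `units` and `source` field is what makes the data reproducible and traceable for downstream engineering use.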
Document Materials Information
- Write concise descriptions of materials and their properties in supporting reports.
- Summarize relevant test conditions, limitations, and assumptions for engineering teams.
- Maintain clear documentation of data sources and methodologies.
Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.
Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture.
Drive long-term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
- Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
- Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
- Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
- Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
- Conduct root-cause analysis and implement preventive and long-term remediation solutions.
- Optimize SQL queries, tune stored procedures, and improve data processing performance.
- Document audit findings, validation processes, data flows, standards, and quality reports.
- Build dashboards and reports for data quality KPIs using Power BI/Tableau.
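The profiling and validation-rule work above is described in T-SQL/SSIS terms; the same pattern can be shown as a language-neutral sketch with Python's built-in sqlite3 standing in for the warehouse. The table, columns, and quality rules here are hypothetical.

```python
# Sketch of automated data-quality rules: profile a table and score it
# on completeness (non-null) and validity (range check). The posting
# targets T-SQL/SSIS; sqlite3 is used only to keep the sketch runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, email TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 50.0, "a@x.com"), (2, None, "b@x.com"),
     (3, -10.0, None), (4, 80.0, "d@x.com")],
)

def quality_scores(conn) -> dict:
    total = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    # Completeness rule: share of rows with a non-null email.
    complete = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE email IS NOT NULL").fetchone()[0]
    # Validity rule: share of rows whose amount is present and non-negative.
    valid = conn.execute(
        "SELECT COUNT(*) FROM orders WHERE amount >= 0").fetchone()[0]
    return {"completeness": complete / total, "validity": valid / total}

print(quality_scores(conn))  # {'completeness': 0.75, 'validity': 0.5}
```

Scores like these are exactly what feeds the data-quality KPI dashboards mentioned above.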
Data Stewardship & Governance
- Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
- Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
- Ensure proper and consistent data usage across departments and systems.
- Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
- Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
- Provide training on data entry, data handling, stewardship practices, and data literacy.
- Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield / Low-Maturity Environment
- Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
- Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
- Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
- Drive adoption of data quality and governance practices across business and technical teams.
- Support long-term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
- Advanced T-SQL, SQL Server development, debugging, and performance tuning.
- SSIS development, deployment, and troubleshooting.
- Data profiling, validation rule design, quality scoring, and measurement techniques.
- ETL/ELT pipeline design, debugging, and optimization.
- Data modeling (conceptual, logical, physical).
- Metadata management and lineage documentation.
- Reporting and dashboarding with Power BI, Tableau, or similar tools.
- Strong documentation and communication skills.
Preferred Skills
- Knowledge of DAMA-DMBOK, DCAM, MDM concepts, and governance frameworks.
- Experience in low-maturity/greenfield data environments.
- Familiarity with AI/ML data readiness and feature-store-aligned data structuring.
- Cloud data engineering exposure (Azure, Databricks, GCP).
Education
- Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.
- Master’s degree preferred.
Certifications (Preferred)
- DAMA CDMP (Associate/Practitioner)
- EDM Council DCAM
- ASQ Data Quality Credential
- Collibra Data Steward Certification
- Certified Data Steward (eLearningCurve)
- Cloud/AI certifications (Azure, Databricks, Google)
Job Title: Senior Data Engineer / Analytics Engineer
Location: West Los Angeles, CA (Onsite)
Compensation: $180,000 base salary + 10% bonus
Overview
We are looking for a Senior Data Engineer / Analytics Engineer to help architect and build scalable data solutions that power business insights for sales and marketing teams. This role is ideal for someone who enjoys being both strategic and hands-on, designing modern data architectures while actively building pipelines, models, and dashboards.
The ideal candidate has deep experience in modern data stack technologies and has worked closely with high-volume sales and marketing organizations, particularly supporting Salesforce-driven environments.
Key Responsibilities
Data Architecture & Engineering
- Design and build scalable data pipelines and data models that support analytics and reporting across the organization.
- Architect and implement solutions using Snowflake, DBT, Python, and Fivetran within a modern data stack.
- Optimize Snowflake environments for cost and performance, including warehouse configuration, query optimization, and storage strategies.
- Build and maintain robust data transformation pipelines using DBT for modeling, testing, and validation.
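dbt's built-in tests (such as `not_null` and `unique`) work by compiling each declared test into a SQL query that returns failing rows; a test passes when it returns zero rows. The sketch below shows that pattern in Python with sqlite3 standing in for the warehouse; the table and data are hypothetical.

```python
# Sketch of the pattern behind dbt schema tests: each test is a SQL
# query selecting failing rows, and zero rows means the test passed.
# sqlite3 stands in for the warehouse; the table is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex"), (2, "Globex"), (None, "Hooli")])

# not_null test: failing rows are those with a NULL key.
NOT_NULL = "SELECT * FROM dim_customer WHERE customer_id IS NULL"
# unique test: failing rows are keys appearing more than once.
UNIQUE = """
    SELECT customer_id FROM dim_customer
    WHERE customer_id IS NOT NULL
    GROUP BY customer_id HAVING COUNT(*) > 1
"""

def run_test(conn, sql: str) -> int:
    """Return the number of failing rows (0 means the test passed)."""
    return len(conn.execute(sql).fetchall())

print(run_test(conn, NOT_NULL), run_test(conn, UNIQUE))  # 1 1
```

In real dbt these tests are declared in YAML against a model, but the compiled behavior is the failing-rows query shown here.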
Analytics & Business Intelligence
- Develop high-impact dashboards and reporting solutions using Power BI to support decision-making across the business.
- Partner with stakeholders to define KPIs, metrics, and data models that support sales and marketing performance tracking.
- Ensure data reliability, consistency, and accessibility across analytics platforms.
CRM Data & Sales Analytics
- Work extensively with Salesforce data, helping clean, structure, and optimize complex CRM datasets.
- Design scalable data models that support reporting on sales performance, marketing attribution, pipeline analytics, and revenue metrics.
- Implement solutions to improve data quality and usability across CRM-driven reporting.
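Cleaning and structuring CRM data of this kind typically starts with normalizing values and deduplicating records. The sketch below uses Salesforce-style field names (`Id`, `Name`, `LastModifiedDate`) purely as hypothetical examples and keeps the most recently modified record per account.

```python
# Sketch of CRM data cleaning: normalize company names and collapse
# duplicate account records, preferring the most recently modified one.
# Field names mimic Salesforce conventions but the data is hypothetical.

records = [
    {"Id": "001A", "Name": "Acme Corp ", "LastModifiedDate": "2024-01-05"},
    {"Id": "001B", "Name": "ACME CORP", "LastModifiedDate": "2024-03-01"},
    {"Id": "001C", "Name": "Globex", "LastModifiedDate": "2024-02-10"},
]

def normalize(name: str) -> str:
    """Canonical matching key: trimmed, lowercased, single-spaced."""
    return " ".join(name.lower().split())

def dedupe(records: list[dict]) -> list[dict]:
    """Keep one record per normalized name, preferring the newest."""
    best: dict[str, dict] = {}
    for rec in records:
        key = normalize(rec["Name"])
        if key not in best or rec["LastModifiedDate"] > best[key]["LastModifiedDate"]:
            best[key] = rec
    return list(best.values())

cleaned = dedupe(records)
print([r["Id"] for r in cleaned])  # ['001B', '001C']
```

Comparing ISO-8601 date strings lexicographically is safe here, which keeps the survivorship rule a one-line comparison.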
Business Partnership
- Partner closely with Sales and Marketing teams in a high-volume sales environment to understand reporting needs and deliver actionable insights.
- Translate business questions into scalable data solutions and analytics frameworks.
- Communicate technical concepts clearly to non-technical stakeholders and collaborate effectively across teams.
Required Qualifications
- 5+ years of BI Engineering, Data Engineering, or Analytics Engineering experience.
- Proven experience acting as both a data architect and hands-on builder.
- Strong experience with:
- Snowflake (including cost and performance optimization)
- DBT for transformations, modeling, and data validations
- Python
- Power BI - must have
- Experience working with Salesforce or similar CRM tools, including cleaning, structuring, and building scalable reporting solutions for complex CRM datasets.
- Experience supporting Sales and Marketing teams in high-volume sales environments.
- Strong communication skills and ability to work collaboratively with cross-functional stakeholders.
Preferred Qualifications
- Experience with Salesforce data architecture and CRM analytics.
- Background working with large-scale sales operations or marketing analytics teams.
- Experience building modern ELT data pipelines and scalable analytics frameworks.
Work Environment
- Onsite role in West Los Angeles
- Highly collaborative environment working closely with data, sales, marketing, and leadership teams.
- Translate business process designs into clear master and transactional data definitions for S/4HANA.
- Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.
- Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).
- Define data objects, attributes, and mandatory fields.
- Support business rules, validations, and derivations.
- Align data structures to SAP best practices and industry standards.
- Support data cleansing, enrichment, and harmonization activities.
- Define and validate data mapping rules from legacy systems to S/4HANA.
- Participate in mock conversions, data loads, and reconciliation activities.
- Ensure data quality thresholds are met prior to cutover.
- Support the establishment and enforcement of global data standards and policies.
- Work closely with Master Data and Data Governance teams.
- Help define roles, ownership, and stewardship models for value stream data.
- Contribute to data quality monitoring and remediation processes.
- Support functional and integrated testing with a strong focus on data accuracy.
- Validate business scenarios using migrated and created data.
- Support cutover planning and execution from a data perspective.
- Provide post-go-live support and stabilization.
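As an illustration of the mapping and reconciliation activities described above, the sketch below renames legacy fields to target names, applies one derivation rule, and then reconciles record counts and a control total. The legacy field names, mapping, and rule are hypothetical, not actual S/4HANA structures.

```python
# Sketch of a legacy-to-target field mapping plus a reconciliation check,
# as performed during mock conversions. All field names are hypothetical.

FIELD_MAP = {"MATNR_OLD": "material", "WERKS_OLD": "plant", "QTY": "quantity"}

legacy = [
    {"MATNR_OLD": "M-100", "WERKS_OLD": "1000", "QTY": "5"},
    {"MATNR_OLD": "M-200", "WERKS_OLD": "1000", "QTY": "3"},
]

def migrate(rows: list[dict]) -> list[dict]:
    """Rename fields per FIELD_MAP and derive quantity as an integer."""
    out = []
    for row in rows:
        rec = {FIELD_MAP[k]: v for k, v in row.items()}
        rec["quantity"] = int(rec["quantity"])  # derivation/validation rule
        out.append(rec)
    return out

def reconcile(src: list[dict], tgt: list[dict]) -> bool:
    """Record counts and the quantity control total must both match."""
    counts_match = len(src) == len(tgt)
    totals_match = (sum(int(r["QTY"]) for r in src)
                    == sum(r["quantity"] for r in tgt))
    return counts_match and totals_match

target = migrate(legacy)
print(reconcile(legacy, target))  # True
```

Count checks catch dropped or duplicated records, while the control total catches value-level corruption that counts alone would miss; real cutovers reconcile many such totals per object.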
Requirements:
- 5 years of SAP functional experience with a strong data focus.
- Hands-on experience with SAP S/4HANA (greenfield preferred).
- Proven involvement in large-scale, global ERP implementations.
- Deep understanding of value stream business processes and related data objects.
- Experience supporting data migration, cleansing, and validation.
Required Skills:
- Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer).
- Understanding of S/4HANA data model changes vs. ECC.
- Experience working with SAP MDG or similar governance tools preferred.
- Familiarity with data migration tools (e.g., SAP Migration Cockpit, LVM, ETL tools).
- Ability to read and interpret functional specs and data models.
- Strong stakeholder management and communication skills.
- Ability to work across global, cross-functional teams.
- Detail-oriented with strong analytical and problem-solving skills.
- Comfortable operating in a fast-paced transformation environment.
Preferred Skills:
- Experience in manufacturing, building materials, or asset-intensive industries.
- Prior role as Functional Data Lead or Data Domain Lead.
- Experience defining global templates and harmonized data models.
- Knowledge of data quality tools and metrics.
- Experience with MDG and setting up cost center and profit center groups.
Senior Data Modeler
Hybrid 3-4 days onsite
Location: Phoenix, Arizona
Salary: $130,000 - $150,000 base
A large, operationally complex organization is undergoing a major modernization of its data platform and is building a new, cloud-native analytics foundation from the ground up. This is a greenfield opportunity for a senior-level data modeler to establish best practices, influence architecture, and help shape how data is organized and used across the business.
This role sits at the center of a multi-year transformation focused on modern analytics, scalable data products, and strong collaboration between data and business teams.
What You’ll Be Working On
- Designing and implementing enterprise data models across conceptual, logical, and physical layers
- Establishing Medallion architecture patterns and reusable modeling assets
- Building dimensional and semantic models that support analytics and reporting
- Partnering closely with domain experts and functional leaders to translate business needs into data structures
- Collaborating with data engineers to align models with ELT pipelines and analytics frameworks
- Helping define modeling standards and upskilling senior engineers in modern data modeling practices
- Contributing hands-on to data engineering work where needed (SQL, transformations, optimization)
- Proactively identifying analytics opportunities and recommending data structures to support them
This role is roughly 40% data modeling, 30% hands-on engineering, and 30% cross-functional collaboration.
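For readers less familiar with the Medallion pattern referenced above, the idea is a bronze layer holding raw records as ingested, a silver layer applying cleaning and typing, and a gold layer serving aggregated, analytics-ready views. The data and fields in this sketch are hypothetical.

```python
# Sketch of Medallion layering: bronze = raw as ingested, silver =
# cleaned and typed, gold = aggregated for reporting. Hypothetical data.

bronze = [  # raw, untyped, possibly dirty
    {"order_id": "1", "region": " west ", "amount": "100.0"},
    {"order_id": "2", "region": "WEST", "amount": "50.5"},
    {"order_id": "3", "region": "east", "amount": None},  # dropped in silver
]

def to_silver(rows):
    """Clean and type records; drop rows failing basic quality rules."""
    out = []
    for r in rows:
        if r["amount"] is None:
            continue
        out.append({"order_id": int(r["order_id"]),
                    "region": r["region"].strip().lower(),
                    "amount": float(r["amount"])})
    return out

def to_gold(rows):
    """Aggregate silver into a reporting-ready summary per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'west': 150.5}
```

Keeping bronze immutable is what makes silver and gold reproducible: any cleaning rule can be changed and the downstream layers rebuilt from the raw data.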
Must-Have Experience
- Strong, hands-on experience with data modeling (dimensional, canonical, semantic)
- Deep understanding of Medallion architecture
- Advanced SQL and experience working with a modern cloud data warehouse
- Experience with dbt for transformations and modeling
- Hands-on experience in cloud-native data environments (AWS preferred)
- Ability to work directly with business stakeholders and explain technical concepts clearly
- Experience collaborating closely with data engineers on execution
Nice to Have
- Python experience
- Familiarity with Informatica or reverse-engineering legacy data models
- Exposure to streaming or near-real-time data pipelines
- Experience with visualization tools (tool choice is flexible)
Who Will Thrive in This Role
- A senior individual contributor who enjoys building from scratch
- Someone who can act as a modeling expert and mentor in an organization formalizing this practice
- Comfortable working in ambiguity and taking initiative
- Strong communicator who enjoys partnering with both technical and non-technical teams
- Equally comfortable discussing business concepts and physical data models
Why This Role Is Unique
- Greenfield data modeling initiative with real influence
- Opportunity to define standards that will be used across the organization
- Work on large-scale, real-world operational and analytical data
- High visibility within a growing data organization
- Flexible work setup for individual contributors
If you’re excited about shaping a modern data foundation and want to be the person who defines how data is modeled, understood, and used, this is a rare opportunity to make a lasting impact.
Interested in helping build the next phase of hyperscale data center expansion?
BlueSky Resource Solutions is partnering with a leading infrastructure services provider who is seeking a Regional Director of Data Center Infrastructure to oversee delivery operations within a major hyperscale market.
This role will lead ISP deployments within data center environments while building operational processes, developing field teams, and ensuring high levels of client satisfaction.
The ideal candidate is a hands-on operational leader with experience managing complex infrastructure projects, supporting business growth, and maintaining strong safety, quality, and financial performance standards.
Your project direction:
- Provide leadership for structured cabling and inside plant (ISP) infrastructure projects within large-scale data center environments.
- Oversee project lifecycle activities including planning, staffing, scheduling, quality control, and final project turnover.
- Build and lead field teams including supervisors, technicians, and project support staff.
- Develop and implement operational standards, documentation practices, testing procedures, and installation guidelines aligned with industry standards.
- Maintain strong relationships with enterprise and hyperscale data center customers, ensuring service-level commitments and project milestones are met.
- Collaborate with construction partners, electrical contractors, and facility operations teams to coordinate infrastructure deployment.
- Monitor project financials including labor forecasting, materials planning, change management, and cost control.
- Identify opportunities to improve operational efficiency through standardized processes, prefabrication, and digital reporting tools.
The best fit:
- 8+ years of experience in data center infrastructure, structured cabling, or network deployment environments.
- Experience managing field teams and overseeing multiple projects.
- Demonstrated experience managing project budgets, scheduling, and operational performance metrics.
- Strong client-facing communication and leadership capabilities.
- Industry certifications are considered a plus.
- Ability to meet site access requirements including background screening and safety compliance.
Sr Data & BI Engineer (Hybrid)
We’re partnering with a growing organization seeking a SQL-focused Data & BI Engineer to build and optimize data pipelines, support ETL processes, and drive reporting infrastructure. This role sits at the intersection of data engineering and business intelligence, with strong visibility across teams and leadership.
What You’ll Do
- Design, build, and maintain SQL-based data pipelines and transformations
- Develop and optimize ETL processes to support reporting and analytics
- Write performant SQL for data modeling, transformation, and downstream consumption
- Support and enhance reporting infrastructure (SSRS → Power BI migration)
- Partner with business and technical teams to deliver scalable data solutions
- Improve data quality, structure, and accessibility across systems
- Contribute to performance tuning and optimization of data workflows
What You Bring
- Strong SQL skills with experience in data transformation and pipeline development
- Experience with ETL tools or frameworks (SSIS or similar)
- Exposure to BI tools such as Power BI or SSRS
- Experience working with structured data models in a production environment
- Ability to operate across both data engineering and reporting use cases
Environment
- Hybrid: 3 days onsite
- Evolving data environment with active investment in modernization
- Transitioning reporting stack from SSRS to Power BI
- Collaborative team with dedicated DBA support
Compensation
$120K – $140K base + bonus potential and good benefits
Job Overview:
We are seeking a Data Engineer to support data pipeline development and ETL processes.
Responsibilities:
- Build and maintain ETL pipelines
- Optimize database performance
- Work with structured and unstructured data
- Ensure data integrity
Requirements:
- Degree in Data Engineering or related field
- Knowledge of SQL, Python
- Familiarity with data warehousing concepts