Company Description
Press Ganey is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts, and build lifetime loyalty among their customers and employees.
Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that have earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients, and you will make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.
Our Mission:
We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.
Our Values:
To put Human Experience at the heart of organizations so every person can be seen and understood.
Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.
Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.
Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.
Dare to innovate: We challenge the status quo with creativity and innovation as our true north.
Better together: We check our egos at the door. We work together, so we win together.
We are seeking an experienced Staff Data Engineer to join our Unified Data Platform team. The ideal candidate will design, develop, and maintain enterprise-scale data infrastructure leveraging Azure and Databricks technologies. This role involves building robust data pipelines, optimizing data workflows, and ensuring data quality and governance across the platform. You will collaborate closely with analytics, data science, and business teams to enable data-driven decision-making.
Duties & Responsibilities:
- Design, build, and optimize data pipelines and workflows in Azure and Databricks, including Data Lake and SQL Database integrations.
- Implement scalable ETL/ELT frameworks using Azure Data Factory, Databricks, and Spark.
- Optimize data structures and queries for performance, reliability, and cost efficiency.
- Drive data quality and governance initiatives, including metadata management and validation frameworks.
- Collaborate with cross-functional teams to define and implement data models aligned with business and analytical requirements.
- Maintain clear documentation and enforce engineering best practices for reproducibility and maintainability.
- Ensure adherence to security, compliance, and data privacy standards.
- Mentor junior engineers and contribute to establishing engineering best practices.
- Support CI/CD pipeline development for data workflows using GitLab or Azure DevOps.
- Partner with data consumers to publish curated datasets into reporting tools such as Power BI.
- Stay current with advancements in Azure, Databricks, Delta Lake, and data architecture trends.
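Pipeline work like that described above typically hinges on incremental (watermark-based) loads so that only new or changed rows are reprocessed on each run. As a rough sketch in plain Python, standing in for what would be PySpark/Delta code on Databricks (the table, column names, and timestamps here are invented for illustration):

```python
# Toy source table: each row carries an updated_at timestamp.
SOURCE = [
    {"id": 1, "score": 88, "updated_at": "2024-01-01T00:00:00"},
    {"id": 2, "score": 91, "updated_at": "2024-01-03T00:00:00"},
    {"id": 3, "score": 75, "updated_at": "2024-01-05T00:00:00"},
]

def incremental_load(source, target, watermark):
    """Copy only rows newer than the stored watermark, then advance it."""
    new_rows = [r for r in source if r["updated_at"] > watermark]
    for row in new_rows:
        target[row["id"]] = row          # idempotent upsert keyed by id
    return max((r["updated_at"] for r in new_rows), default=watermark)

target = {}
wm = incremental_load(SOURCE, target, "2024-01-02T00:00:00")
# Only the rows newer than the watermark land in the target.
```

The same pattern appears in Azure Data Factory tumbling-window triggers and Delta Lake `MERGE` statements; the watermark is what keeps reruns idempotent and cost-efficient.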
Technical Skills:
- Advanced proficiency in Azure (5+ years), including Data Lake, ADF, and SQL.
- Strong expertise in Databricks (5+ years), Apache Spark (5+ years), and Delta Lake (5+ years).
- Proficient in SQL (10+ years) and Python (5+ years); familiarity with Scala is a plus.
- Strong understanding of data modeling, data governance, and metadata management.
- Knowledge of source control (Git), CI/CD, and modern DevOps practices.
- Familiarity with the Power BI visualization tool.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or related field.
- 7+ years of experience in data engineering, with significant hands-on work in cloud-based data platforms (Azure).
- Experience building real-time data pipelines and streaming frameworks.
- Strong analytical and problem-solving skills.
- Proven ability to lead projects and mentor engineers.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Master's degree in Computer Science, Engineering, or a related field.
- Exposure to machine learning integration within data engineering pipelines.
Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
Additional Information for US based jobs:
Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer and is committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.
Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
The expected base salary for this position ranges from $110,000 to $170,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.
All your information will be kept confidential according to EEO guidelines.
Our privacy policy can be found here: legal-privacy/
Job Title: Senior Data Engineer
Location: Chicago, IL (Hybrid)
Department: Data & Analytics
Reports To: Head of Data Engineering / Data Platform Lead
Role Overview
We are seeking a highly skilled Senior Data Engineer with strong Python development expertise and deep experience in Snowflake to design, build, and optimize scalable enterprise data solutions. This role is based in Chicago, IL and will support regulatory and risk data initiatives in a highly governed environment.
The ideal candidate has hands-on experience building modern cloud data platforms and is familiar with risk management frameworks, BCBS 239 principles, and Governance, Risk & Compliance (GRC) requirements within financial services.
Key Responsibilities
Data Engineering & Architecture
Design, develop, and maintain scalable data pipelines using Python.
Build and optimize data models, transformations, and data marts within Snowflake.
Develop robust ELT/ETL frameworks for structured and semi-structured data.
Optimize Snowflake performance, cost efficiency, clustering, and workload management.
Implement automation, monitoring, and CI/CD for data pipelines.
Risk & Regulatory Data Management
Support regulatory reporting aligned with BCBS 239 (risk data aggregation and reporting).
Ensure data traceability, lineage, reconciliation, and auditability.
Implement controls aligned with Governance, Risk & Compliance (GRC) frameworks.
Partner with Risk, Finance, Compliance, and Audit teams to deliver accurate and governed data assets.
Data Governance & Quality
Develop and enforce data quality validation frameworks.
Maintain metadata, lineage documentation, and data catalog integration.
Implement data access controls and security best practices.
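The data quality duties above usually amount to running declarative rules over incoming rows and surfacing failures for reconciliation and audit. A minimal sketch, with invented field names and rules (not the client's actual framework):

```python
# Declarative data quality rules: field name -> validity predicate.
# These trade fields and allowed currencies are illustrative only.
RULES = {
    "trade_id": lambda v: v is not None,
    "notional": lambda v: isinstance(v, (int, float)) and v >= 0,
    "currency": lambda v: v in {"USD", "EUR", "GBP"},
}

def validate(rows, rules):
    """Return (passed, failures); each failure names the row index and field."""
    failures = []
    for i, row in enumerate(rows):
        for field, check in rules.items():
            if not check(row.get(field)):
                failures.append((i, field))
    return len(failures) == 0, failures

rows = [
    {"trade_id": "T1", "notional": 1_000_000, "currency": "USD"},
    {"trade_id": None, "notional": -5, "currency": "JPY"},
]
ok, failures = validate(rows, RULES)
```

In a BCBS 239 context the failure records, not just the pass/fail flag, are what matter: they feed the lineage, reconciliation, and audit trails the posting calls out.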
Technical Leadership
Provide mentorship and code reviews for data engineering team members.
Promote engineering best practices and documentation standards.
Collaborate cross-functionally with architects, analysts, and business stakeholders.
Required Qualifications
7+ years of experience in Data Engineering or Data Platform development.
Strong Python programming expertise (Pandas, PySpark, Airflow, etc.).
Hands-on experience with Snowflake (data modeling, Snowpipe, Streams & Tasks, performance tuning).
Advanced SQL skills and deep understanding of data warehousing concepts.
Experience supporting BCBS 239 compliance or similar regulatory reporting frameworks.
Experience working within Governance, Risk & Compliance (GRC) structures.
Experience in cloud environments (AWS, Azure, or GCP).
Strong understanding of data lineage, controls, reconciliation, and audit requirements.
Preferred Qualifications
Experience in banking, capital markets, or financial services.
Knowledge of credit risk, market risk, liquidity risk, or regulatory reporting domains.
Experience with data governance tools (Collibra, Alation, etc.).
Familiarity with DevOps practices, Docker, Kubernetes.
Experience building enterprise data platforms in highly regulated environments.
Key Competencies
Strong problem-solving and analytical thinking.
Ability to operate in a regulated, audit-driven environment.
Excellent communication and stakeholder management skills.
Detail-oriented with a focus on data accuracy and integrity.
Leadership mindset with hands-on technical capability.
Position Summary:
The Scientific Computing and Data group at the Icahn School of Medicine at Mount Sinai partners with scientists to accelerate scientific discovery. To achieve these aims, we support a cutting-edge high-performance computing and data ecosystem along with MD/PhD-level support for researchers. The group is composed of a high-performance computing team, a clinical data warehouse team and a data services team.
The Lead HPC Architect, Cybersecurity, High Performance Computational and Data Ecosystem, is responsible for designing, implementing, and managing the cybersecurity infrastructure and technical operations of Scientific Computing’s computational and data science ecosystem. This ecosystem includes a 25,000+ core, 40+ petabyte (usable) high-performance computing (HPC) system, clinical research databases, and a software development infrastructure for local and national projects. The HPC system is the fastest at any academic biomedical center in the world (Top500 list).
To meet Sinai’s scientific and clinical goals, the Lead brings a strategic, tactical and customer-focused vision to evolve the ecosystem to be continually more resilient, secure, scalable and productive for basic and translational biomedical research. The Lead combines deep technical expertise in cybersecurity, HPC systems, storage, networking, and software infrastructure with a strong focus on service, collaboration, and strategic planning for researchers and clinicians throughout the organization and beyond. The Lead is an expert troubleshooter, productive partner and leader of projects. The Lead will work with stakeholders to ensure the HPC infrastructure complies with governmental funding agency requirements and to promote efficient resource utilization by researchers.
This position reports to the Director for HPC and Data Ecosystem in Scientific Computing and Data.
Key Responsibilities:
HPC Cybersecurity & System Administration:
- Design, implement, and manage all cybersecurity operations within the HPC environment, ensuring alignment with industry standards (NIST, ISO, GDPR, HIPAA, CMMC, NYC Cyber Command, etc.).
- Implement best practices for data security, including but not limited to encryption (at rest, in transit, and in use), audit logging, access control, authentication control, configuration management, secure enclaves, and confidential computing.
- Perform full-spectrum HPC system administration: installation, monitoring, maintenance, usage reporting, troubleshooting, backup, and performance tuning across HPC applications, web services, databases, job schedulers, networking, storage, compute, and hardware to optimize workload efficiency.
- Lead resolution of complex cybersecurity and system issues; provide mentorship and technical guidance to team members.
- Ensure that all designs and implementations meet cybersecurity, performance, scalability, and reliability goals. Ensure that the design and operation of the HPC ecosystem is productive for research.
- Lead the integration of HPC resources with laboratory equipment such as genomic sequencers, microscopes, and clinical systems for data ingestion, in alignment with all regulatory requirements.
- Develop, review and maintain security policies, risk assessments, and compliance documentation accurately and efficiently.
- Collaborate with institutional IT, compliance, and research teams to ensure regulatory, Sinai policy, and operational alignment.
- Design and implement hybrid and cloud-integrated HPC solutions using on-premise and public cloud resources.
- Partner with other peers regionally, nationally and internationally to discover, propose and deploy a world-class research infrastructure for Mount Sinai.
- Stay current with emerging HPC, cloud, and cybersecurity technologies to keep the organization’s infrastructure up-to-date.
- Work collaboratively, effectively and productively with other team members within the group and across Mount Sinai.
- Provide after-hours support as needed.
- Perform other duties as assigned or requested.
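One concrete technique behind the audit-logging requirement above is a hash-chained (tamper-evident) log, in which each entry's hash commits to the previous entry so after-the-fact edits are detectable. A minimal stdlib sketch with invented log messages; a production system would add cryptographic signing, secure storage, and SIEM integration:

```python
import hashlib

def append_entry(log, message):
    """Append a log entry whose hash chains to the previous entry."""
    prev = log[-1]["hash"] if log else "0" * 64
    h = hashlib.sha256((prev + message).encode()).hexdigest()
    log.append({"message": message, "hash": h})

def verify_chain(log):
    """Recompute every link; any edited entry breaks the chain."""
    prev = "0" * 64
    for entry in log:
        expected = hashlib.sha256((prev + entry["message"]).encode()).hexdigest()
        if expected != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, "user alice: login")
append_entry(log, "user alice: read /secure/genomics/run42")
intact = verify_chain(log)
log[0]["message"] = "user mallory: login"   # simulate tampering
tampered_detected = not verify_chain(log)
```

Tamper-evident logging of this kind is one building block behind the NIST and HIPAA audit-control requirements the posting references.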
Requirements:
- Bachelor’s degree in computer science, engineering or another scientific field. Master's or PhD preferred.
- 10 years of progressive HPC system administration experience with Enterprise Linux releases (Red Hat/CentOS/Rocky) and batch cluster environments.
- Experience with all aspects of high-throughput HPC including schedulers (LSF or Slurm), networking (Infiniband/Gigabit Ethernet), parallel file systems and storage, configuration management systems (xCAT, Puppet and/or Ansible), etc.
- Proficient in cybersecurity processes, posture, regulations, approaches, protocols, firewalls, data protection in a regulated environment (e.g. finance, healthcare).
- In-depth knowledge of HIPAA, NIST, FISMA, GDPR and related compliance standards, with proven experience building and maintaining compliant HPC systems.
- Experience with secure enclaves and confidential computing.
- Proven ability to provide mentorship and technical leadership to team members.
- Proven ability to lead complex projects to completion in collaborative, interdisciplinary settings with minimal guidance.
- Excellent analytical ability and troubleshooting skills.
- Excellent communication, documentation, collaboration and interpersonal skills. Must be a team player and customer focused.
- Scripting and programming experience.
Preferred Experience
- Proficient with cloud services, orchestration tools (OpenShift/Kubernetes), cost optimization, and hybrid HPC architectures.
- Experience with Azure, AWS or Google cloud services.
- Experience with the LSF job scheduler and IBM Spectrum Scale (GPFS).
- Experience in a healthcare environment.
- Experience in a research environment is highly preferred.
- Experience with software that enables privacy-preserving linking of PHI.
- Experience with Globus data transfer.
- Experience with web services, SAP HANA, Oracle, SQL, MariaDB and other database technologies.
Strength through Unity and Inclusion
The Mount Sinai Health System is committed to fostering an environment where everyone can contribute to excellence. We share a common dedication to delivering outstanding patient care. When you join us, you become part of Mount Sinai’s unparalleled legacy of achievement, education, and innovation as we work together to transform healthcare. We encourage all team members to actively participate in creating a culture that ensures fair access to opportunities, promotes inclusive practices, and supports the success of every individual.
At Mount Sinai, our leaders are committed to fostering a workplace where all employees feel valued, respected, and empowered to grow. We strive to create an environment where collaboration, fairness, and continuous learning drive positive change, improving the well-being of our staff, patients, and organization. Our leaders are expected to challenge outdated practices, promote a culture of respect, and work toward meaningful improvements that enhance patient care and workplace experiences. We are dedicated to building a supportive and welcoming environment where everyone has the opportunity to thrive and advance professionally. Explore this opportunity and be part of the next chapter in our history.
About the Mount Sinai Health System:
Mount Sinai Health System is one of the largest academic medical systems in the New York metro area, with more than 48,000 employees working across eight hospitals, more than 400 outpatient practices, more than 300 labs, a school of nursing, and a leading school of medicine and graduate education. Mount Sinai advances health for all people, everywhere, by taking on the most complex health care challenges of our time — discovering and applying new scientific learning and knowledge; developing safer, more effective treatments; educating the next generation of medical leaders and innovators; and supporting local communities by delivering high-quality care to all who need it. Through the integration of its hospitals, labs, and schools, Mount Sinai offers comprehensive health care solutions from birth through geriatrics, leveraging innovative approaches such as artificial intelligence and informatics while keeping patients’ medical and emotional needs at the center of all treatment. The Health System includes more than 9,000 primary and specialty care physicians; 13 joint-venture outpatient surgery centers throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and more than 30 affiliated community health centers. We are consistently ranked by U.S. News & World Report's Best Hospitals, receiving high "Honor Roll" status.
Equal Opportunity Employer
The Mount Sinai Health System is an equal opportunity employer, complying with all applicable federal civil rights laws. We do not discriminate, exclude, or treat individuals differently based on race, color, national origin, age, religion, disability, sex, sexual orientation, gender, veteran status, or any other characteristic protected by law. We are deeply committed to fostering an environment where all faculty, staff, students, trainees, patients, visitors, and the communities we serve feel respected and supported. Our goal is to create a healthcare and learning institution that actively works to remove barriers, address challenges, and promote fairness in all aspects of our organization.
DEPLOY has been retained to find a Reporting & Data Architect Lead who combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform—designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.
You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.
At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.
Key Responsibilities
Enterprise Reporting (Hands-On Development)
- Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
- Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
- Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
- Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
- Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (Responsible, with the Director of Technology Accountable).
- Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
- Support reporting scalability as our client grows (new factories, new business units, new product lines).
Enterprise Reporting Standards & Governance
- Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
- Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
- Review and approve reporting change requests, data model modifications, and access requests.
- Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
- Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.
Enterprise Data Architecture
- Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
- Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
- Build and govern the centralized semantic model that powers reporting across the company.
- Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
- Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.
Leadership & Collaboration
- Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
- Communicate complex architectural concepts in clear, business-friendly terms.
- Lead cross-functional initiatives that require unified data structures or scalable reporting.
- Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.
Ideal Candidate Profile
- Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
- Track record of establishing and enforcing enterprise reporting standards and governance.
- Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
- Able to operate as both an individual contributor and a strategic leader.
- Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
- Comfortable influencing architectural decisions and guiding technical execution.
- Strong command of foundational tools and languages such as:
- DAX
- Power Query / M
- SQL
- Fabric pipelines / ETL tooling
- Experience with automation and AI-assisted analytics workflows.
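Owning a data dictionary and naming standards, as described in this posting, often reduces in practice to automated checks that proposed dataset fields are canonical and consistently named. A hypothetical sketch (the dictionary entries and the snake_case convention are illustrative assumptions, not the client's actual standards):

```python
import re

# Hypothetical slice of an enterprise data dictionary: canonical field
# names mapped to their business definitions (invented for illustration).
DATA_DICTIONARY = {
    "customer_id": "Unique customer key shared across ERP and CRM.",
    "order_date": "Date the order was placed (plant local time).",
    "net_sales_usd": "Invoiced amount net of discounts, in USD.",
}
SNAKE_CASE = re.compile(r"^[a-z][a-z0-9]*(_[a-z0-9]+)*$")

def review_fields(fields):
    """Flag fields that break the naming convention or are off-dictionary."""
    issues = []
    for f in fields:
        if not SNAKE_CASE.match(f):
            issues.append((f, "naming"))
        elif f not in DATA_DICTIONARY:
            issues.append((f, "not in dictionary"))
    return issues

issues = review_fields(["customer_id", "OrderDate", "net_sales_usd", "margin_pct"])
```

Run as part of a reporting change-request review, a check like this catches drift before a non-canonical field lands in a certified dataset or shared semantic model.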
Be the one who makes a difference!
At Vertex Education we are a team of high achievers, courageous leaders, and passionate believers in changing lives through education. As a purpose-led education services provider, our mission is destined to benefit many and yet it starts with just one person inspired to work together with us to make a memorable and meaningful difference for our clients, schools, students, and communities. Be the one who makes a difference—with us.
The Marketing Analytics Analyst supports Legacy Traditional Schools by transforming marketing and enrollment data into actionable insights that improve student recruitment and family engagement. This role integrates data from multiple platforms, develops clear and effective dashboards, and delivers analysis that helps the marketing team make smarter, faster decisions.
Reporting to the Director of Business Intelligence, the Marketing Analytics Analyst serves as a strategic partner to marketing leadership by improving data quality, clarifying performance metrics, and identifying opportunities to optimize campaigns, resource allocation, and enrollment outcomes. This role helps ensure marketing efforts are measurable, efficient, and continuously improving so more families can find and connect with the educational opportunities Legacy provides.
Essential Functions:
1. Marketing Data Management and Governance:
- Collect, integrate, and validate data from web analytics, CRM, paid media, SIS, application, and marketing automation platforms.
- Own and maintain marketing data integrations and reporting workflows across tools such as Google Analytics, HubSpot, SchoolMint, and student information systems.
- Define, document, and maintain standardized marketing metrics, reporting logic, and data governance practices.
- Ensure marketing data is accurate, consistent, and reliable across platforms and reporting outputs.
2. Marketing Analytics and Insights:
- Analyze campaign performance, audience behavior, lead flow, and enrollment conversion trends to identify actionable opportunities.
- Design, support, and evaluate A/B tests to improve campaign effectiveness and inform future strategy.
- Develop forecasts related to lead volume, conversion, enrollment trends, and marketing performance.
- Track and interpret key performance metrics such as cost per lead, conversion rates, application yield, and enrollment outcomes.
- Translate complex data into clear insights and practical recommendations for marketing and business leaders.
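The funnel metrics named above (cost per lead, conversion rates, application yield) are simple ratios over campaign and enrollment data. For illustration, with invented figures:

```python
# Compute standard marketing funnel metrics; all inputs are made-up
# example values, not Legacy's actual campaign data.
def funnel_metrics(spend, leads, applications, enrollments):
    return {
        "cost_per_lead": round(spend / leads, 2),
        "lead_to_app_rate": round(applications / leads, 3),
        "application_yield": round(enrollments / applications, 3),
    }

m = funnel_metrics(spend=12_500, leads=500, applications=120, enrollments=60)
# 12,500 / 500 leads gives a $25.00 cost per lead, and so on down the funnel.
```

The analytical work in the role is less about these formulas than about making their inputs trustworthy: consistent lead definitions across Google Analytics, HubSpot, and SchoolMint are what make the ratios comparable over time.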
3. Reporting and Visualization:
- Build, maintain, and enhance dashboards and reports that communicate marketing performance to stakeholders.
- Automate recurring reporting processes to improve efficiency, reduce manual effort, and increase accuracy.
- Tailor reporting views and analyses to meet the needs of marketing leadership and cross-functional partners.
- Present findings in a clear, compelling, and decision-oriented manner.
4. Financial and Performance Analysis:
- Monitor campaign budgets, pacing, and performance against plan.
- Evaluate the return on investment of paid media and broader marketing initiatives.
- Identify opportunities to improve efficiency and maximize enrollment impact per dollar spent.
- Partner with marketing leaders to refine strategy based on financial, operational, and performance data.
5. Continuous Improvement and Innovation:
- Stay current on marketing analytics tools, trends, and best practices.
- Recommend and implement process improvements, tools, and analytical approaches that strengthen marketing decision-making.
- Identify opportunities to streamline internal workflows, improve reporting usability, and increase data accessibility.
- Support ongoing innovation in marketing measurement and analysis to better advance student recruitment goals.
Required Qualifications:
- Bachelor’s degree in Marketing, Data Analytics, Statistics, Business, or a related field.
- Minimum of 3 years of experience in marketing analytics, campaign analysis, business intelligence, or a related data-focused role.
- Proficiency in SQL and at least one programming language, such as Python or R.
- Hands-on experience with web analytics platforms, CRM systems, and marketing automation tools.
- Experience with data visualization and reporting tools such as Tableau, Power BI, Looker, or similar platforms.
- Strong understanding of data quality, governance, and metric standardization best practices.
- Demonstrated ability to synthesize data into actionable business insights and communicate findings effectively to non-technical stakeholders.
Preferred Qualifications:
- Certifications in Google Analytics, HubSpot, or related marketing analytics platforms.
- Experience with student information systems such as Infinite Campus or PowerSchool.
- Experience with application or enrollment platforms such as SchoolMint.
- Familiarity with paid media, programmatic advertising, and digital campaign measurement.
- Advanced Excel skills, including modeling, scenario analysis, and data manipulation
Be excited to be a part of our team and grow your career with us!
Be the one who enables us to positively impact over 258,000 students across multiple states while driving our growth forward so we can enrich even more lives. Be the one who helps us achieve excellence for over 226 schools that we support with academics, finance, technology, human resources, communications, marketing, facilities, construction, and food services. Be the one who is a diverse thinker, a team player, a smart risk taker, an innovator, and a difference maker by encouraging others to climb higher and reach farther to further education.
- Be yourself surrounded by wonderful people who care about you, value your unique skills, and lift you up.
- Be supported in your work by caring leaders and team members who want you to succeed.
- Be empowered to make a difference and climb higher and reach farther to change lives through education.
- Be well in all aspects of your life from your physical, mental, and emotional wellbeing to your finances.
- Enjoy industry-leading pay, rewards, referral bonuses, and unlimited flexible paid time off for performance.
- Be able to care for your health and your family with comprehensive medical, dental and vision benefits and invest in your future with 401(k) plans with a 6% employer match on your contributions.
- Enhance your growth and development with mentoring and money to take training classes.
- Thrive in a welcoming, supportive, and inclusive environment where we treat others with fairness and respect, celebrate diversity, and elevate equality and inclusion as an equal opportunity employer.
Be the one who makes a difference!
With an innovative mind, a hungry heart, and engaging spirit you can change lives through education. Be a part of Vertex Education and let’s make a difference together. Apply Today!
OZ – Databricks Architect / Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Application Development: Experience building GenAI applications is a strong plus.
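The Medallion Architecture referenced above organizes data into bronze (raw), silver (cleaned and conformed), and gold (business-level aggregate) layers. In production these would be Delta Lake tables and PySpark jobs; the library-free sketch below only illustrates the layering, and every record and field name in it is invented:

```python
from collections import defaultdict

# Hypothetical raw ("bronze") records as they might land from an ingestion
# job: untyped strings, with duplicates and bad rows mixed in.
bronze = [
    {"order_id": "1", "region": "east", "amount": "100.0"},
    {"order_id": "1", "region": "east", "amount": "100.0"},  # duplicate
    {"order_id": "2", "region": "west", "amount": "oops"},   # unparseable
    {"order_id": "3", "region": "east", "amount": "50.5"},
]

def to_silver(rows):
    """Clean and conform: cast types, drop bad rows, deduplicate on order_id."""
    seen, silver = set(), []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # in practice, quarantine rows that fail casting
        if row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        silver.append({"order_id": row["order_id"],
                       "region": row["region"], "amount": amount})
    return silver

def to_gold(rows):
    """Aggregate to a business-level table: revenue per region."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # one duplicate dropped, one bad row dropped, east totals summed
```

In a Databricks setting each function would instead write a Delta table, so every layer stays independently queryable and auditable.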
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges while ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Microsoft Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
Who We Are
At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.
What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.
Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?
Role Summary:
The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.
Responsibilities:
Data Architecture & Tooling
- Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
- Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
- Ensure alignment between data infrastructure and evolving business needs across distribution operations
- Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite
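SuiteQL, mentioned above, is NetSuite's SQL-92-based query language. As a rough illustration, the snippet below runs an analogous saved-search-style rollup against an in-memory SQLite table standing in for NetSuite transaction data; the table name, columns, and values are all hypothetical:

```python
import sqlite3

# In-memory stand-in for NetSuite transaction records (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transaction_lines (id INTEGER, type TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO transaction_lines VALUES (?, ?, ?)",
    [(1, "SalesOrd", 120.0), (2, "SalesOrd", 80.0), (3, "CustInvc", 45.0)],
)

# Count and total by transaction type, the kind of metric a governed
# data dictionary would define once and reuse everywhere.
rows = conn.execute(
    "SELECT type, COUNT(*) AS n, SUM(amount) AS total "
    "FROM transaction_lines GROUP BY type ORDER BY type"
).fetchall()
print(rows)
```

The value of centralizing such queries is that "total sales by type" means the same thing in every downstream report.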
Data Governance & Quality
- Define and enforce data standards, metric definitions, and naming conventions across all business domains
- Establish data ownership, lineage documentation, and access governance policies
- Implement monitoring and alerting for data quality issues across source systems and the warehouse
- Build and maintain a data dictionary that serves as the single source of truth for the organization
Orchestration of Analysts & Systems
- Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
- Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
- Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
- Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business
Qualifications:
Required
- 3-5 years of experience in data analytics, business intelligence, or data engineering
- 2+ years in a lead or management role overseeing analysts or data team members
- Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
- Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
- Proven track record establishing data governance standards and documentation practices
- Experience integrating and managing multiple data sources across SaaS and ERP platforms
- Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders
Preferred
- Experience in distribution, wholesale, or supply chain environments
- Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
- Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
- Background working within IT-led or hybrid IT/Analytics teams
Benefits:
- Health insurance
- Dental insurance
- Vision insurance
- Life & Disability insurance
- 401(K) with company match
Company Paid holidays and PTO:
- Feetures offers 20 PTO days, available to all employees on day one of employment, no matter your role. After five years at Feetures, your PTO increases to 25 days. Days can be used for vacation, appointments, and sick time.
- We offer 10 company paid holidays and 1 floating holiday per year.
Perks:
- Parking provided (Charlotte office and onsite at Hickory office)
- Employee Engagement team
- Monthly stipend to pursue an active lifestyle
Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.
The Business Data Analyst will play a critical role in supporting data-driven decision-making for core PMA business functions. This position is focused on extracting valuable insights from complex datasets, creating operational reports, and developing intuitive BI dashboards tailored to business needs. Working within an enterprise reporting structure, the analyst will perform on-demand data discovery, conduct trend analysis, and develop analytics tools that empower stakeholders with meaningful insights. By ensuring data accuracy, quality and relevance, this role will support data governance activities and continuous process improvements that align with strategic objectives.
Responsibilities:
Data Analysis & Business Insights
* Conduct in-depth data analysis to support strategic business initiatives.
* Perform trend analysis and develop predictive insights to help business teams identify patterns, risks, and opportunities.
* Respond to data discovery requests and develop operational reports that support key business metrics and decision-making.
* Apply best practices and make recommendations that improve stakeholders' understanding of the data.
* Translate complex data findings into actionable recommendations, presenting insights in a clear and meaningful way for non-technical stakeholders.
Enterprise Reporting & BI Dashboard Development
* Work closely with business stakeholders to understand their reporting needs, providing insights that drive data-informed decisions.
* Design, develop, and maintain interactive BI dashboards tailored to answering key business questions, providing real-time access to critical metrics and performance insights.
* Utilize enterprise BI tools to create data visualizations that enable easy exploration of data and insights.
* Partner with stakeholders to test and refine dashboards, ensuring they align with business requirements and enhance decision-making capabilities.
* Facilitate training and support for business users on BI dashboards and reporting tools, enabling self-service access to data insights.
Data Quality Support & Validation
* Collaborate with data governance and data engineering teams to ensure high data quality and integrity in enterprise reports and dashboards.
* Perform data validation and verification as part of report development to ensure data accuracy, consistency, and relevance for business users.
* Monitor data accuracy metrics and support data issue resolution, maintaining a high standard of data quality across reporting tools.
* Demonstrate commitment to Company's Code of Business Conduct and Ethics, and apply knowledge of compliance policies and procedures, standards and laws applicable to job responsibilities in the performance of work.
Requirements:
* 3+ years of experience in data, analytics, or business intelligence.
* Bachelor's degree in Information Management, Data Science, Computer Science, Mathematics, Statistics, Economics, Psychology or a related field.
* Proficient in SQL for data extraction and manipulation across various data sources.
* Strong analytical skills to interpret complex datasets and draw actionable insights.
* Experience with BI platforms like QlikSense or Power BI for data visualization and dashboard development.
* Familiar with advanced Excel functions for data manipulation and reporting.
* Understanding of statistical methods and trend analysis for identifying patterns and creating projections.
* Familiar with predictive modeling or basic machine learning concepts is a plus.
* Proficiency with scripting languages or tools (such as Python, R, or VBA) for process automation is a plus.
* Basic understanding of data integration, ETL processes, and data warehousing concepts.
* Skilled in presenting data in a way that tells a compelling story and drives informed decision-making.
* Strong interpersonal skills to work effectively with cross-functional teams in underwriting, finance, and IT.
* High level of precision in data analysis, ensuring reports and insights are accurate and free of errors.
* Analytical mindset to investigate data challenges, identify root causes, and develop efficient solutions.
* Ability to adapt to evolving data requirements and troubleshoot issues with minimal supervision.
* Strong organizational skills to balance multiple projects and meet reporting deadlines.
* Effective time management to handle ad hoc requests and prioritize tasks in a fast-paced environment.
* Open and motivated to learn new tools, methods, and data practices.
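The trend-analysis and projection requirements above reduce, in the simplest case, to fitting and extending a line. A stdlib-only least-squares sketch, with invented monthly figures:

```python
# Hypothetical monthly unit volumes; the goal is to estimate the trend
# and project it one month forward.
months = [1, 2, 3, 4, 5, 6]
units = [100, 104, 110, 113, 119, 124]

n = len(months)
mean_x = sum(months) / n
mean_y = sum(units) / n

# Ordinary least-squares slope and intercept.
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(months, units))
    / sum((x - mean_x) ** 2 for x in months)
)
intercept = mean_y - slope * mean_x

def project(month):
    """Extend the fitted line to a future month."""
    return intercept + slope * month

print(round(slope, 2), round(project(7), 1))
```

Real projections would also report a confidence interval and check whether a linear fit is appropriate at all; this only shows the mechanics.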
The Data Engineering Manager is responsible for leading and developing a team of Data Architects and Data Solutions Engineers while actively contributing to hands-on technical projects. This role will manage the data warehouse in Snowflake, engineering automations in Alteryx and/or other solutions, while ensuring efficient project intake and prioritization. The ideal candidate combines strong technical expertise with proven technical leadership skills to drive innovation and operational excellence across the data engineering function.
As a Data Engineering Manager, you will:
- Set the technical strategy for data engineering solutions and data architecture, which includes end-to-end data pipeline strategy, consumption management, project scoping, and data automation.
- Design, develop, and optimize data engineering solutions using Snowflake, DBT, Azure Data Factory, and Alteryx.
- Continuously assess and optimize the data engineering technology stack to ensure scalability, performance, and alignment with industry best practices.
- Implement best practices for data modeling, ETL/ELT processes, and automation.
- Own and maintain the Snowflake data warehouse roadmap and engineering standards.
- Lead data project scoping, prioritization, and resource allocation to ensure timely delivery of data engineering solutions.
- Ensure data integrity, security, and compliance across all engineering solutions.
- Collaborate with IT and the rest of the data teams to align solutions with enterprise architecture.
- Establish documentation and governance standards for data engineering workflows ensuring completeness, audit readiness, and traceability in alignment with enterprise architecture.
- Directly supervise the Data Architecture & Data Engineering team in accordance with Nicolet's policies and applicable laws. Responsibilities include interviewing, hiring, and training employees; planning, assigning, and directing work; appraising performance; coaching, mentoring and development planning; rewarding and disciplining employees; addressing complaints and resolving problems.
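A recurring ETL/ELT best practice implied by the responsibilities above is making loads idempotent, so that re-running a batch converges to the same state instead of duplicating rows. Snowflake expresses this with MERGE and dbt with incremental models; the sketch below shows the same pattern using SQLite's UPSERT against a hypothetical dim_customer table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, tier TEXT)"
)

def load_batch(rows):
    """Apply a batch so that re-running it yields the same final state."""
    conn.executemany(
        "INSERT INTO dim_customer (id, name, tier) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name, tier = excluded.tier",
        rows,
    )

batch = [(1, "Acme", "gold"), (2, "Globex", "silver")]
load_batch(batch)
load_batch(batch)  # re-running the same batch is a no-op, not a duplicate load
load_batch([(2, "Globex", "gold")])  # a late update overwrites in place

final = conn.execute(
    "SELECT id, name, tier FROM dim_customer ORDER BY id"
).fetchall()
print(final)
```

The same merge-on-key discipline is what lets a failed pipeline run simply be retried without manual cleanup.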
Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, Data Analytics or related field.
- 7+ years in data engineering or related data roles required.
- 3+ years in leadership or management positions required.
- Strong technical expertise in Snowflake, DBT, Azure Data Factory and SQL or like systems.
- Familiarity with Alteryx, UiPath, Tableau, Power BI and Salesforce is preferred.
- Ability to design and implement scalable data solutions.
- Excellent leadership, communication, and organizational skills.
- Ability to balance hands-on development with team development.
- Must be able to work fully in-office. This position does not allow for remote work.
Benefits:
- Medical, Dental, Vision, & Life Insurance
- 401(k) with a company match
- PTO & 11 1/2 Paid Holidays
The above statements are intended to describe the general nature and level of work being performed. They are not intended to be construed as an exhaustive list of all responsibilities and skills required for the position.
Equal Opportunity Employer/Veterans/Disabled
Location: Atlanta, Georgia
Full/Part Time: Full-Time
Regular/Temporary: Regular
About Us
Overview
Georgia Tech prides itself on its technological resources, collaborations, high-quality student body, and its commitment to building an outstanding and diverse community of learning, discovery, and creation. We strongly encourage applicants whose values align with our institutional values, as outlined in our Strategic Plan. These values include academic excellence, diversity of thought and experience, inquiry and innovation, collaboration and community, and ethical behavior and stewardship. Georgia Tech has policies to promote a healthy work-life balance and is aware that attracting faculty may require meeting the needs of two careers.
About Georgia Tech
Georgia Tech is a top-ranked public research university situated in the heart of Atlanta, a diverse and vibrant city with numerous economic and cultural strengths. The Institute serves more than 45,000 students through top-ranked undergraduate, graduate, and executive programs in engineering, computing, science, business, design, and liberal arts. Georgia Tech's faculty attracted more than $1.4 billion in research awards this past year in fields ranging from biomedical technology to artificial intelligence, energy, sustainability, semiconductors, neuroscience, and national security. Georgia Tech ranks among the nation's top 20 universities for research and development spending and No. 1 among institutions without a medical school.
Georgia Tech's Mission and Values
Georgia Tech's mission is to develop leaders who advance technology and improve the human condition. The Institute has nine key values that are foundational to everything we do:
1. Students are our top priority.
2. We strive for excellence.
3. We thrive on diversity.
4. We celebrate collaboration.
5. We champion innovation.
6. We safeguard freedom of inquiry and expression.
7. We nurture the wellbeing of our community.
8. We act ethically.
9. We are responsible stewards.
Over the next decade, Georgia Tech will become an example of inclusive innovation, a leading technological research university of unmatched scale, relentlessly committed to serving the public good; breaking new ground in addressing the biggest local, national, and global challenges and opportunities of our time; making technology broadly accessible; and developing exceptional, principled leaders from all backgrounds ready to produce novel ideas and create solutions with real human impact.
Department Information
The Office of Institutional Research and Planning (IRP) at Georgia Tech is a research and analytics service unit dedicated to supporting the campus community. Our team of institutional research and data analytics professionals combines technical and creative skills to inform institutional strategic decision-making, planning, and research across campus. In addition to institutional reporting and compliance, IRP provides data education, support, and resources to all campus units.
Visit our website to learn more about what we do.
Job Summary
Data Analysts analyze data, interpret trends and patterns, and provide insights to support decision-making processes. They develop data models, perform data mining and statistical analysis, and collaborate with stakeholders to optimize data-driven strategies.
Responsibilities
Job Duty 1 -
Collect, analyze, and interpret data from various sources, databases, and systems to extract insights, trends, and patterns that inform business decisions, strategies, and operations.
Job Duty 2 -
Develop and maintain data models, queries, and reports using SQL, Python, R, or data analysis tools to perform data cleansing, transformation, and visualization tasks.
Job Duty 3 -
Identify data quality issues, anomalies, and discrepancies in datasets, conduct data validation, data profiling, and data integrity checks to ensure data accuracy and reliability.
Job Duty 4 -
Create data visualizations, dashboards, and data analytics reports to communicate data findings, trends, and key metrics to stakeholders, management, and decision-makers.
Job Duty 5 -
Conduct ad-hoc data analysis, exploratory data analysis, and statistical analysis to support decision-making processes, performance monitoring, and data-driven insights.
Job Duty 6 -
Perform data mining, predictive analytics, and machine learning tasks to uncover hidden patterns, predict outcomes, and drive data-driven decision-making in organizations.
Job Duty 7 -
Utilize data analytics tools, business intelligence platforms, and statistical software packages to conduct data analysis, data modeling, and data visualization tasks efficiently and accurately.
Job Duty 8 -
Stay current on data analytics trends, tools, and methodologies through training, certifications, and industry publications to enhance data analysis skills and knowledge.
Job Duty 9 -
Collaborate with business users, data scientists, and Information Technology teams to define data requirements, analytics requirements, and data-driven solutions for business problems and opportunities.
Job Duty 10 -
Perform other job-related duties as assigned.
Responsibilities
The Institutional Research Data Analyst will also be expected to perform various duties specific to institutional research, including but not limited to:
- Responding to intermediate to high difficulty/complexity ad-hoc data and analysis requests
- Adhering to federal, state, and institutional policies, regulations, and requirements related to data security, privacy, and governance
- Completing or supporting the completion of externally driven compliance and data-related reporting, including:
  - Federal, e.g., IPEDS, NSF-HERD, NSF-GSS, etc.
  - State, e.g., USG data collections, data requests, etc.
  - Higher education organizations, e.g., AAUDE, SREB, NSC, accrediting bodies, etc.
Required Qualifications
Educational Requirements
Bachelor's Degree in related discipline or equivalent combination of education and experience. Advanced certification may be preferred or required (some profiles may require additional education).
Required Experience
Four or more years of relevant experience.
Proposed Salary
Annual Salary Range: $75,751 to $80,000
Knowledge, Skills, & Abilities
SKILLS
- Performs all the standard and technical aspects of the job
- Applies in-depth professional, technical, or industry knowledge to manage significantly complex assignments/projects/programs
- Advanced knowledge of principles and practices of a particular field of specialization and Institute policies, practices, and procedures
USG Core Values
The University System of Georgia comprises our 25 institutions of higher education and learning as well as the System Office. Our USG Statement of Core Values names Integrity, Excellence, Accountability, and Respect. These values serve as the foundation for all that we do as an organization, and each USG community member is responsible for demonstrating and upholding these standards. More details on the USG Statement of Core Values and Code of Conduct are available in USG Board Policy 8.2.18.1.2 and can be found on-line at policymanual/section8/C224/#p8.2.18_personnel_conduct.
Additionally, USG supports Freedom of Expression as stated in Board Policy 6.5 Freedom of Expression and Academic Freedom found on-line at policymanual/section6/C2653.
Equal Employment Opportunity
The Georgia Institute of Technology (Georgia Tech) is an Equal Employment Opportunity Employer. The Institute is committed to maintaining a fair and respectful environment for all. To that end, and in accordance with federal and state law, Board of Regents policy, and Institute policy, Georgia Tech provides equal opportunity to all faculty, staff, students, and all other members of the Georgia Tech community, including applicants for admission and/or employment, contractors, volunteers, and participants in institutional programs, activities, or services. Georgia Tech complies with all applicable laws and regulations governing equal opportunity in the workplace and in educational activities.
Equal opportunity and decisions based on merit are fundamental values of the University System of Georgia ("USG") and Georgia Tech. Georgia Tech prohibits discrimination, including discriminatory harassment, on the basis of an individual's race, ethnicity, ancestry, color, religion, sex (including pregnancy), national origin, age, disability, genetics, or veteran status in its programs, activities, employment, and admissions. Further, Georgia Tech prohibits citizenship status, immigration status, and national origin discrimination in hiring, firing, and recruitment, except where such restrictions are required in order to comply with law, regulation, executive order, or Attorney General directive, or where they are required by Federal, State, or local government contract.
Other Information
This is not a supervisory position.
This position does not have any financial responsibilities.
This position will not be required to drive.
This role is not considered a position of trust.
This position does not require a purchasing card (P-Card).
This position does not require travel.
This position does not require security clearance.
Background Check
Successful candidate must be able to pass a background check. Please visit employment/pre-employment-screening