Syniti Data Migration Tool Jobs in USA
Job Description
At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.
The Boeing Commercial Airplane (BCA) Engineering Data Analytics Tool Team (BEDAT) is looking for an Engineering Analytics Analyst to help transform the BCA Engineering digital footprint in Everett, WA.
Primary Responsibilities:
Collect, analyze, and implement technical requirements for key performance indicators and metrics in a Cognos-based dashboard serving a community of 1,500 users
Design and support a backend data source using MS SQL Server/Cognos by extracting and staging data from 40 upstream databases, creating a single source of authority for all BCA engineering-related metrics and analytics
Develop ad-hoc queries, reports, and analyses using SQL, R, and Tableau in collaboration with business partners to analyze emerging opportunities
Work closely with all levels of BCA Engineering leadership to understand the business and technical requirements
Gain familiarity with Google Cloud Platform
Leads cross-functional teams across multiple business processes
Ensures accurate deliverables, maintains results, and communicates them to all participants
Collects, analyzes, documents, and integrates requirements from multiple process owners
Applies and makes recommendations for the process, data, and applications/systems architecture
May benchmark, or assist in benchmarking, best practices and industry standards; presents best practices at internal events
Learns to balance competing strategic initiatives
Conducts business requirements review, coordinates testing schedules, and assists in the preparation of test scripts
Communicates with information technology organizations to represent customers and functional users on project requirements, activities, and status
Serves as liaison to resolve business requirement issues between customer and information technology representatives
Demonstrates basic knowledge and use of Project Management and/or Program Management best-practice tools necessary to assist clients working through the life cycle of an improvement project, including facilitating plan development
Seeks opportunities for company-wide synergy with practitioners of methods and tools from other skills or organizations
Assists with integration of remaining aspects of enterprise architecture (e.g. information, data, and applications architecture)
Ensures solution has architectural compliance and strategic alignment with business objectives
Leads, participates, or works together to reach agreement on the development of business architecture design, phased implementation, and use
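The staging pattern in the responsibilities above (extracting data from many upstream databases into a single source of authority) can be sketched roughly as follows. All database, table, and column names are hypothetical, and in-memory SQLite stands in for MS SQL Server so the sketch is self-contained:

```python
import sqlite3

# Hypothetical illustration of the single-source-authority pattern:
# consolidate metric rows from several upstream databases into one
# staging table that dashboards and reports would query.

def make_upstream(name, rows):
    # Each upstream source is modeled as its own in-memory SQLite database.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE metrics (program TEXT, metric TEXT, value REAL)")
    db.executemany("INSERT INTO metrics VALUES (?, ?, ?)", rows)
    db.commit()
    return name, db

def stage(upstreams):
    # Single staging database serving all downstream analytics.
    staging = sqlite3.connect(":memory:")
    staging.execute(
        "CREATE TABLE staged_metrics (source TEXT, program TEXT, metric TEXT, value REAL)"
    )
    for name, db in upstreams:
        for program, metric, value in db.execute(
            "SELECT program, metric, value FROM metrics"
        ):
            # Tag each row with its source so lineage is preserved.
            staging.execute(
                "INSERT INTO staged_metrics VALUES (?, ?, ?, ?)",
                (name, program, metric, value),
            )
    staging.commit()
    return staging

upstreams = [
    make_upstream("db_stress", [("777X", "open_items", 12.0)]),
    make_upstream("db_loads", [("777X", "open_items", 5.0), ("737", "open_items", 3.0)]),
]
staging = stage(upstreams)
total = staging.execute(
    "SELECT SUM(value) FROM staged_metrics WHERE metric = 'open_items'"
).fetchone()[0]
print(total)  # 20.0
```

In a production version each upstream connection would point at a real database, but the consolidation and lineage-tagging logic is the same.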
Basic Qualifications (Required Skills/ Experience):
1 or more years' experience collecting, organizing, synthesizing, and analyzing data from multiple sources; summarizing findings; and developing conclusions and recommendations from appropriate data sources.
1 or more years' experience utilizing and developing analytical tools and code, e.g., SQL, Tableau, Cognos, Teradata, cloud platforms, etc.
Bachelor's degree OR equivalent experience.
Preferred Qualifications (Desired Skills/Experience):
1 or more years' experience supporting multiple managers/leaders in developing monthly, quarterly, and yearly strategic plans.
1 or more years' experience working directly with executives or senior leaders
Drug Free Workplace:
Boeing is a Drug Free Workplace where post-offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.
Pay & Benefits:
At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities.
The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work.
The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements.
Pay is based upon candidate experience and qualifications, as well as market and business considerations.
Summary Pay Range:
Level 3 - $93,090 - $105,280
Applications for this position will be accepted until Mar. 23, 2026
Export Control Requirements:
This is not an Export Control position.
Education
Bachelor's Degree or Equivalent Required
Relocation
Relocation assistance is not a negotiable benefit for this position.
Visa Sponsorship
Employer will not sponsor applicants for employment visa status.
Shift
This position is for 1st shift
Equal Opportunity Employer:
Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.
AI Data & Python Tools Engineer
We're seeking an AI Data and Python Tools Engineer to develop and deploy intelligent tools that leverage big data infrastructure and modern AI architecture. This role combines strong software engineering fundamentals with the ability to build production-ready AI applications at speed, including integration with Model Context Protocol (MCP) systems.
Responsibilities:
- Develop and deploy AI-powered full-stack applications using Python, React, and modern machine learning frameworks
- Design and streamline data pipelines, train and validate ML models, and implement robust evaluation methods
- Collaborate with cross-functional teams to solve complex problems and integrate scalable, cloud-based AI solutions
- Rapidly prototype, test, and iterate on AI tools with a strong focus on performance, flexibility, and scalability
- Maintain clear technical documentation, perform code reviews, and support the full software development lifecycle
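The "train and validate ML models, and implement robust evaluation methods" responsibility can be illustrated with a minimal, dependency-free sketch: a train/holdout split scored with RMSE against a deliberately trivial baseline model. A real pipeline would substitute a PyTorch/TensorFlow/scikit-learn model; the data here is synthetic and the function names are invented:

```python
import math
import random

# Hypothetical sketch: holdout evaluation of a baseline model that simply
# predicts the training-set mean. The point is the evaluation harness
# (split, fit, score), not the model itself.

def train_test_split(data, test_ratio=0.25, seed=0):
    # Deterministic shuffle so the evaluation is reproducible.
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

def fit_mean_model(train):
    # "Training" the trivial baseline: memorize the mean target value.
    mean = sum(y for _, y in train) / len(train)
    return lambda x: mean

def rmse(model, test):
    # Root-mean-square error over the held-out set.
    return math.sqrt(sum((model(x) - y) ** 2 for x, y in test) / len(test))

data = [(x, 2.0 * x) for x in range(100)]  # synthetic linear data
train, test = train_test_split(data)
model = fit_mean_model(train)
score = rmse(model, test)
print(round(score, 2))
```

Any candidate model can be dropped into the same harness, which is what makes the evaluation "robust": the split and the metric stay fixed while models vary.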
Software Engineering & AI/ML Data Tools Development
- 3+ years of Python development with a background in back-end services and data processing
- Exposure to AI/ML algorithms
- Familiarity with ML frameworks (TensorFlow, PyTorch, scikit-learn)
- Understanding of LLMs, vector databases, and retrieval systems
- Experience with Model Context Protocol (MCP) integration and server development
Big Data & Cloud Infrastructure
- Knowledge of building and deploying cloud based applications
- Hands-on experience with cloud data platforms (AWS/GCP/Azure)
- Proficiency with big data technologies (Spark, Kafka, or similar streaming platforms)
- Experience with data warehouses (Snowflake, BigQuery, Redshift) and data lakes
- Knowledge of containerization (Docker/Kubernetes) and infrastructure as code
Preferred Experience:
- Experience building web applications with modern frameworks (React, Vue, or Angular)
- API development and integration experience
- Basic UX/UI design sensibilities for internal tooling
- Experience with real-time data processing and analytics
- Background in building developer tools or internal platforms
- Familiarity with AI/ML operations (MLOps) practices (e.g., experience using Airflow)
- Experience building MCP servers and integrating with AI assistants
- Knowledge of structured data exchange protocols and API design for AI systems.
Type: Full Time
Location: Austin, TX or Cupertino, CA (Monday-Friday onsite)
*Relocation assistance can be offered based on individual needs and circumstances*
Translate business process designs into clear master and transactional data definitions for S/4HANA.
Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.
Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).
Define data objects, attributes, and mandatory fields.
Support business rules, validations, and derivations.
Align data structures to SAP best practices and industry standards.
Support data cleansing, enrichment, and harmonization activities.
Define and validate data mapping rules from legacy systems to S/4HANA.
Participate in mock conversions, data loads, and reconciliation activities.
Ensure data quality thresholds are met prior to cutover.
Support the establishment and enforcement of global data standards and policies.
Work closely with Master Data and Data Governance teams.
Help define roles, ownership, and stewardship models for value stream data.
Contribute to data quality monitoring and remediation processes.
Support functional and integrated testing with a strong focus on data accuracy.
Validate business scenarios using migrated and created data.
Support cutover planning and execution from a data perspective.
Provide post-go-live support and stabilization.
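The mapping and pre-cutover quality-gate duties above can be sketched as follows. The field names, mapping rules, and 98% threshold are invented for illustration and are not actual S/4HANA definitions:

```python
# Hypothetical sketch of legacy-to-S/4HANA mapping rules with a quality gate
# applied before cutover. Every field name and rule here is invented.

MAPPING = {
    # legacy field -> hypothetical S/4HANA target field
    "MATNR_OLD": "Product",
    "VENDOR_NO": "BusinessPartner",
    "PLANT_CD": "Plant",
}
MANDATORY = {"Product", "Plant"}  # stand-in for real mandatory-field rules

def convert(legacy_record):
    # Apply the mapping rules to a single legacy record.
    return {target: legacy_record.get(src) for src, target in MAPPING.items()}

def passes_validation(record):
    # Mandatory-field check stands in for richer business rules/derivations.
    return all(record.get(field) not in (None, "") for field in MANDATORY)

def mock_conversion(legacy_rows, threshold=0.98):
    # A mock conversion run: convert everything, then compare the pass rate
    # against the quality threshold that gates cutover.
    converted = [convert(r) for r in legacy_rows]
    ok = sum(passes_validation(r) for r in converted)
    pass_rate = ok / len(converted)
    return converted, pass_rate, pass_rate >= threshold

legacy = [
    {"MATNR_OLD": "M-100", "VENDOR_NO": "V-7", "PLANT_CD": "1000"},
    {"MATNR_OLD": "", "VENDOR_NO": "V-8", "PLANT_CD": "1000"},  # fails: no Product
]
rows, rate, ready = mock_conversion(legacy)
print(rate, ready)  # 0.5 False
```

In a real project the conversion would run through the SAP Migration Cockpit rather than hand-rolled code, but the reconcile-then-gate structure is the same.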
Requirements:
5 years of SAP functional experience with a strong data focus.
Hands-on experience with SAP S/4HANA (greenfield preferred).
Proven involvement in large-scale, global ERP implementations.
Deep understanding of value stream business processes and related data objects.
Experience supporting data migration, cleansing, and validation.
Required Skills:
Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).
Understanding of S/4HANA data model changes vs. ECC.
Experience working with SAP MDG or similar governance tools preferred.
Familiarity with data migration tools (e.g., SAP Migration Cockpit, LSMW, ETL tools).
Ability to read and interpret functional specs and data models.
Strong stakeholder management and communication skills.
Ability to work across global, cross-functional teams.
Detail-oriented with strong analytical and problem-solving skills.
Comfortable operating in a fast-paced transformation environment.
Preferred Skills:
Experience in manufacturing, building materials, or asset-intensive industries.
Prior role as Functional Data Lead or Data Domain Lead.
Experience defining global templates and harmonized data models.
Knowledge of data quality tools and metrics.
Experience with MDG and setting up cost center and profit center groups.
Resource 1 is in need of a Sr. Data Architect/Modeler for a long-term contract in downtown Chicago. Our client requires 2 days/week onsite (Tues/Wed or Tues/Thurs), so candidates must be local.
The consultant will join a Dynamics 365 implementation project to lead data migration, Dataverse/CDM data modeling, and analytics enablement for various business units. They will design and validate data models and guide migration/testing efforts.
Responsibilities:
- Lead data migration strategy and roadmap for the Dynamics 365 implementation.
- Design, document, and validate Dataverse/CDM data models and entity relationships for D365 use cases.
- Own CDM-centric data design: map legacy/source systems to Dynamics 365 CE & F&O CDM entities, identify gaps, propose extensions, and document modeling decisions.
- Establish best practices for CDM usage, lineage, versioning, and review gates.
- Assess downstream impacts of model and migration changes and define additional data capture or retention needs.
- Guide and review ETL/pipeline implementation with the data engineering team using Synapse/ADF/Fabric and ADLS Gen2.
- Participate in gathering reporting and operational requirements and translating them into data model and migration specifications.
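The CDM gap analysis described above can be sketched as a set comparison between legacy source fields and a target entity's attributes. The attribute names below approximate Dataverse's standard account table, but the legacy fields and the mapping itself are hypothetical:

```python
# Hypothetical sketch: map legacy fields to a target Dataverse/CDM entity,
# then report clean mappings, extension candidates, and unfed attributes.

# Subset of attributes resembling the standard CDM "account" entity.
CDM_ACCOUNT_ATTRS = {"name", "accountnumber", "telephone1", "address1_city"}

# Invented legacy-to-CDM mapping; None marks a field with no standard home.
LEGACY_TO_CDM = {
    "CUST_NAME": "name",
    "CUST_ID": "accountnumber",
    "PHONE": "telephone1",
    "LOYALTY_TIER": None,  # no standard CDM attribute: extension candidate
}

def gap_analysis(mapping, cdm_attrs):
    # Fields that map onto standard attributes.
    mapped = {t for t in mapping.values() if t in cdm_attrs}
    # Legacy fields needing a custom extension on the entity.
    extensions = sorted(src for src, t in mapping.items() if t is None)
    # Standard attributes no legacy field populates (possible data gap).
    unfed = sorted(cdm_attrs - mapped)
    return mapped, extensions, unfed

mapped, extensions, unfed = gap_analysis(LEGACY_TO_CDM, CDM_ACCOUNT_ATTRS)
print(sorted(mapped))  # ['accountnumber', 'name', 'telephone1']
print(extensions)      # ['LOYALTY_TIER']
print(unfed)           # ['address1_city']
```

Each of the three output sets corresponds to a modeling decision the architect documents: keep, extend, or flag for additional data capture.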
Required Skills & Experience:
- Experience with Common Data Model (CDM) and Dynamics 365 (CE and/or F&O) data.
- Prior experience working on a Dynamics 365 migration project.
- Experience with Dataverse/CDM data modeling.
- Familiarity with the Microsoft data stack, including Synapse, ADF, Fabric, and ADLS Gen2.
- Ability to develop and execute test plans, validate migrations, and identify downstream reporting impacts.
Title: Data Migration Specialist
Location: San Diego, CA
Duration: 6-9 Month Assignment + Potential Extensions
Work Model: Fully On-Site
Pay rate: $21-23/hour
Start Date: March 16, 2026
JOB DESCRIPTION
One of our large CDMO clients is seeking a data migration specialist to transition from a legacy Quality Management System (QMS) to MasterControl. This role focuses on extracting structured and unstructured data from the previous QMS platform, validating its accuracy, and entering and organizing information within MasterControl according to established procedures. The ideal candidate is detail-oriented, highly organized, and comfortable working with quality documentation and regulated data environments.
Key Responsibilities
- Extract data from the legacy QMS, including documents, records, metadata, and historical logs.
- Review, clean, and validate extracted data to ensure accuracy, completeness, and compliance with internal standards.
- Input and upload data into MasterControl following defined workflows and naming conventions.
- Collaborate with Quality, IT, and Compliance teams to resolve discrepancies and clarify data requirements.
- Maintain detailed logs and status reports to track progress and identify issues during migration.
- Support testing and verification activities to ensure data integrity after import into MasterControl.
- Follow all SOPs, work instructions, and regulatory guidelines related to data handling and documentation control.
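The extract-validate-track workflow above can be sketched as follows. The "SOP-####" naming convention and metadata fields are invented for illustration and are not MasterControl's actual schema:

```python
import re
from datetime import date

# Hypothetical sketch: check each extracted QMS record against a naming
# convention and required metadata before upload, keeping a status log
# for the progress reports the role maintains. All conventions invented.

NAMING = re.compile(r"^SOP-\d{4}$")
REQUIRED = ("doc_id", "title", "effective_date")

def validate(record):
    # Collect every issue so reviewers see the full picture at once.
    issues = [f for f in REQUIRED if not record.get(f)]
    if record.get("doc_id") and not NAMING.match(record["doc_id"]):
        issues.append("naming_convention")
    return issues

def migrate(records):
    # One log entry per record; nothing is silently dropped.
    log = []
    for rec in records:
        issues = validate(rec)
        status = "ready_for_upload" if not issues else "needs_review"
        log.append({"doc_id": rec.get("doc_id"), "status": status, "issues": issues})
    return log

records = [
    {"doc_id": "SOP-0042", "title": "Cleaning Validation",
     "effective_date": date(2024, 1, 5)},
    {"doc_id": "SOP42", "title": "Batch Records", "effective_date": None},
]
log = migrate(records)
print([entry["status"] for entry in log])  # ['ready_for_upload', 'needs_review']
```

Keeping the issue list in the log (rather than just a pass/fail flag) is what lets Quality and IT resolve discrepancies without re-running the extraction.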
Required Skills and Experience:
- High School Diploma
- Experience with Data Entry or Document Control in a regulated environment
- Ability to type 40+ words per minute
- Microsoft Office proficiency
Nice to Have Skills & Experience:
- Experience with MasterControl
Compensation:
$21.00/hr to $23.00/hr.
Exact compensation may vary based on several factors, including skills, experience, and education.
Employees in this role will enjoy a comprehensive benefits package starting on day one of employment, including options for medical, dental, and vision insurance. Eligibility to enroll in the 401(k)-retirement plan begins after 90 days of employment. Additionally, employees in this role will have access to paid sick leave and other paid time off benefits as required under the applicable law of the worksite location.
Title: Sr. Manager, Data Governance
Location: Richardson, TX (Hybrid)
Duration: 6 months, with possibility of FTE conversion
JOB SUMMARY
This position incubates and establishes a leading-edge global Data Governance function to support business segments, corporate functions, and Digital & Technology stakeholders. Responsibilities include:
- Liaise directly with clients and account teams to provide strategic direction on implementation of data governance programs, best practices, adoption of standards, master data management, and data quality improvement while leveraging leading-edge data governance tools and technology.
- Collaborate with and manage high-performing data governance and data management professionals who support occupier clients and account teams.
- Provide support on data strategy execution in the adoption of data products including enterprise data platform that provides game-changing analytics in the CRE industry.
- Serve as the data governance champion of strategic data products and supporting metadata and reference data.
- Implement and support data ownership and stewardship programs for stakeholders across the business to ensure that account teams adopt improved data governance and management practices.
ESSENTIAL DUTIES AND RESPONSIBILITIES
- Participate in the strategy, planning, and execution for Enterprise Data Governance, focusing on the Building Operations & Experience (BOE) business segment. Ensure the company has urgency, sensitivity, and thought leadership for competitive capabilities around data.
- Demonstrated leadership experience in a large, complex, global organization, including the ability to effectively work and communicate across organizational lines. Ensure business stakeholder understanding, alignment and commitment to the objectives of the data governance and management program(s). Be the champion and evangelist for data, the business value, and the potential innovations. Be the trusted advisor to senior leadership and peers.
- Demonstrated experience in building relationships and leading high-performing teams with top talent around the world. Build a high-performance environment and execute a people strategy that attracts, retains, develops, and motivates the team by fostering an inclusive work environment, communicating vision/values/business strategy, and managing succession and development planning for the team.
- Collaborate with partners across business segments/ business lines, regions and accounts to develop consistent data governance capabilities at all levels, influencing decisions relating to policy, practices, supporting technology, and talent development.
- Establish leading data management practices and shared services relating to data quality, data provisioning, metadata, lineage, reference data, issue management and change management.
- Implement data governance as commodity services that could be leveraged by various clients in different industries. Understand clients' appetite and risk culture in day-to-day support activities and decision-making.
- Establish account team data governance programs. Define data domains and implement business oversight via essential data governance organizations and RACI (e.g., central data governance function, Data Ownership and Stewardship Program, etc.). Establish data standards, policies, and controls. Design and implement the framework, including associated processes, necessary to sustain a data control environment. Monitor compliance with data policies and standards.
- Establish account team and cross-account data quality framework necessary to enable data quality reporting, issue identification, remediation and tracking, ultimately ensuring trust and confidence in data across domains.
- Guide client accounts to adopt strategic data products, including existing account migrations and new account transitions. Manage data to support the company's and its clients' business intelligence, scaling appropriately with business growth.
- Experience in leading and driving leading-edge data innovation initiatives including big data, cloud computing, IoT, data virtualization and federation, etc., is a plus.
- Create and implement strategic approaches, plans, timelines, preparation of business cases to ensure expedited handling of client data protection, and other data compliance and security requirements.
- Develop and implement metrics needed to monitor and report on data governance and data management progress.
- Develop communication approaches and change management strategies; determine presentation focus and emphasis and prepare board-level presentations.
- Performs other duties as assigned.
SUPERVISORY RESPONSIBILITIES
Manages the planning, organization, and controls for a major functional area or department. Position will be responsible for managing direct reports across the Americas region and working with peers across all regions, requiring flexibility in schedule. May also be responsible for matrix reports. This position requires subordinates' recommendations for staff recruitment, selection, promotion, advancement, corrective action and termination. Effectively recommends same for direct reports to next level management for review and approval. Monitors appropriate staffing levels and reports on utilization and deployment of human resources. Leads and supports staff in areas of staffing, selection, training, development, coaching, mentoring, measuring, appraising, and rewarding performance and retention. Leads by example and models behaviors that are consistent with the company's values.
QUALIFICATIONS
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
EDUCATION and EXPERIENCE
Bachelor's degree (BA/BS) from four-year college or university and a minimum of eight years of related experience and/or training, including five years of experience at the management level.
- 5 or more years of progressively responsible management positions in complex organizations required. Demonstrated success with high visibility projects, leaders in technology use and development, change management, budget and business case development and staff development.
- 5 or more years of related experience in related industry; commercial real estate management preferred.
- 7 or more years of data management-related experience such as data analysis, data governance, enterprise information management, data modeling, and data quality management. Analytics experience desired, e.g., data visualization, data analytics, data mining, business intelligence, etc.
- Candidates must have experience working in large organizations with geographically dispersed teams and complex technical environments.
- Experience in dealing with internal and external customers, service providers and vendors. Must be able to manage competing priorities. Needs to be resilient; resolving conflicts quickly to achieve desired business results.
- Bachelor's degree in Business Administration, Information Management, MIS, Business Intelligence and Data Science, Library Science, Computer Science, or related fields; advanced degree preferred.
CERTIFICATES and/or LICENSES
None
COMMUNICATION SKILLS
- Ability to comprehend, analyze, and interpret the most complex business documents. Ability to respond effectively to the most sensitive issues. Ability to write reports, manuals, speeches and articles using distinctive style. Ability to make effective and persuasive presentations on complex topics to employees, clients, top management and/or public groups. Ability to motivate and negotiate effectively with key employees, top management, and client groups to take desired action.
- Ability to establish and maintain a high level of customer trust and confidence in the overall information and analytics space
- Excellent oral, written, and presentation communication skills. Strong negotiation and group facilitation skills; ability to move a process forward, while meeting the needs of a variety of clients.
- Excellent collaboration, influence and leadership skills. Ability to work with various levels of peers including analysts, developers and executives regarding complex business and data related issues.
- Relationship management skills that include excellent listening and consultative capability, the ability to influence and negotiate with business and technology partners to drive change, and the ability to take a broad perspective and make key connections
FINANCIAL KNOWLEDGE
- Requires basic knowledge of financial terms and principles.
- Participates in complex financial/business analysis and report reviews prepared by peers or leaders.
- Manages and oversees the department budget.
REASONING ABILITY
- Ability to solve advanced problems and deal with a variety of options in complex situations. Requires expert level analytical and quantitative skills with proven experience in developing strategic solutions for a growing matrix-based environment. Draws upon the analysis of others and makes recommendations that have a direct impact on the company.
- Understanding of global organizational design and the ability to shape and drive large-scale, cross-functional programs around people, technology, processes, and tools.
- Demonstrated ability to balance long-term strategy with quick wins.
- Demonstrated ability for strategic influencing and education of cross-functional stakeholders about the strategic importance and value of data governance
- Excellent managerial skills; collaborative, imaginative, resourceful, reliable, technically savvy.
- Superior analytical and creative problem-solving skills. Demonstrated successes in data analysis, drawing conclusions and improvement. Apply listening and consultative skills to understand business needs; be able to interpret requirements, identify impacts and analyze problems to determine impacts to business processes across the organization.
- Ability to work well under deadlines, ability to work in a multi-tasking production environment to make good judgments about competing priorities.
- Ability to tell a story to explain or sell a concept.
OTHER SKILLS and/or ABILITIES
- Utilizes an entrepreneurial approach and develops innovative solutions.
- Ability to write business cases, process maps, presentation materials and articles using distinctive style.
- Ability to make effective and persuasive presentations on complex topics across various levels of leadership
- Expert level analytical and quantitative skills with proven experience in developing strategic solutions for a growing matrix-based multi-industry sales environment.
- Ability to use strong conceptual and analytical skills to generate insights and recommendations.
- Demonstrated information management and quantitative skills, including working knowledge of IT infrastructure, various technologies/ platforms, and aligned vendor solutions with enterprise strategic priorities.
- Experience managing small to mid-size teams and delivering results.
- Thorough knowledge of cutting-edge data management tools, industry advances, etc.
- Superior project management/ consulting and leadership skills. Demonstrated ability to facilitate complex, mission critical projects and to develop, participate in and guide multi-disciplinary work teams. Manage task timelines and deliverable schedules and share concerns about deliverables, timelines, and issues with Data Governance services or deliverables.
- Superior ability to manage, manipulate and analyze raw data, draw conclusions, and develop actionable recommendations using technology. Articulate the issues and resolutions via business-friendly communications. Serve as primary day-to-day contact for regional data management issues.
- Advanced understanding of data quality management. Knowledge of data governance and how it impacts business processes.
- Knowledge of master data management in a global environment, including data lifecycle and maintenance processes.
- Skills in MS Visio, Word, and PowerPoint are a plus.
- Experience with reference data management tools, including Collibra, MS Excel, SQL query tools, etc., is a plus.
- Software development lifecycle knowledge, with a background in agile philosophies, and the ability to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.
Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early stage data architecture.
Drive long term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
Conduct root cause analysis and implement preventive and long term remediation solutions.
Optimize SQL queries, tune stored procedures, and improve data processing performance.
Document audit findings, validation processes, data flows, standards, and quality reports.
Build dashboards and reports for data quality KPIs using Power BI/Tableau.
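A data quality rule feeding a KPI report, as described above, might look like the following sketch. In the role itself these rules would live in T-SQL stored procedures and SSIS packages; plain Python is used here so the example is self-contained, and the column names and thresholds are invented:

```python
# Hypothetical sketch: a completeness rule evaluated per column, with each
# column flagged against a quality gate, producing one KPI row per column
# of the kind a Power BI/Tableau dashboard would display.

rows = [
    {"order_id": 1, "customer": "ACME", "amount": 120.0},
    {"order_id": 2, "customer": None,   "amount": 75.5},
    {"order_id": 3, "customer": "Beta", "amount": None},
]

def completeness(rows, column):
    # Completeness KPI: fraction of rows with a non-null value in the column.
    filled = sum(1 for r in rows if r.get(column) is not None)
    return filled / len(rows)

def quality_report(rows, columns, threshold=0.95):
    # One KPI entry per column, flagged when it falls below the quality gate.
    return {
        col: {
            "completeness": round(completeness(rows, col), 2),
            "passes": completeness(rows, col) >= threshold,
        }
        for col in columns
    }

report = quality_report(rows, ["order_id", "customer", "amount"])
print(report["order_id"])  # {'completeness': 1.0, 'passes': True}
print(report["customer"])  # {'completeness': 0.67, 'passes': False}
```

Accuracy, timeliness, and consistency rules slot into the same report structure; only the per-rule scoring function changes.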
Data Stewardship & Governance
Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
Ensure proper and consistent data usage across departments and systems.
Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
Provide training on data entry, data handling, stewardship practices, and data literacy.
Collaborate with cross functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield/Low-Maturity Environment
Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
Drive adoption of data quality and governance practices across business and technical teams.
Support long term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
Advanced T-SQL, SQL Server development, debugging, and performance tuning.
SSIS development, deployment, and troubleshooting.
Data profiling, validation rule design, quality scoring, and measurement techniques.
ETL/ELT pipeline design, debugging, and optimization.
Data modeling (conceptual, logical, physical).
Metadata management and lineage documentation.
Reporting and dashboarding with Power BI, Tableau, or similar tools.
Strong documentation and communication skills.
Preferred Skills
Knowledge of DAMA DMBoK, DCAM, MDM concepts, and governance frameworks.
Experience in low-maturity/greenfield data environments.
Familiarity with AI/ML data readiness and feature store aligned data structuring.
Cloud data engineering exposure (Azure, Databricks, GCP).
Education
Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.
Master’s degree preferred.
Certifications (Preferred)
DAMA CDMP (Associate/Practitioner)
EDM Council DCAM
ASQ Data Quality Credential
Collibra Data Steward Certification
Certified Data Steward (eLearningCurve)
Cloud/AI certifications (Azure, Databricks, Google)
Job Summary:
Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high quality of the data to end users
- Ensure high quality of the in-house data via data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL queries and continuously engage with end consumers on enhancements
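The duties above include defining data quality rules with automated monitoring and remediation reporting. As a minimal sketch of what rule-based quality checks look like in practice, here is a small Python example; the field names and rules are hypothetical illustrations, not the client's actual implementation:

```python
# Minimal rule-based data quality check: each rule is a named predicate
# applied row by row; failures are collected for a remediation report.
# All field names and rules here are hypothetical examples.

def check_quality(rows, rules):
    """Return a list of (row_index, rule_name) for every failed check."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                failures.append((i, name))
    return failures

rules = {
    "id_present": lambda r: r.get("id") is not None,
    "qty_non_negative": lambda r: r.get("qty", 0) >= 0,
}

rows = [
    {"id": 1, "qty": 5},
    {"id": None, "qty": 3},   # fails id_present
    {"id": 2, "qty": -1},     # fails qty_non_negative
]

print(check_quality(rows, rules))  # → [(1, 'id_present'), (2, 'qty_non_negative')]
```

In a production setting the same shape of rule catalog would typically be expressed in SQL or a profiling tool and scheduled to run automatically, feeding the monitoring and ticketing duties listed above.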
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles (e.g., big data cloud compute, NoSQL)
- Applied experience with querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- Big Query experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with focus on automating data processes and driving efficiency within the organization. This role requires a close collaboration with application developers, data engineers, data analysts, data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure the accuracy and consistency of data.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
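The pipeline responsibilities above reduce, at their core, to an extract-transform-load loop. Here is a minimal, self-contained sketch using Python's built-in sqlite3 module; the table names, columns, and cleaning rules are invented for illustration and stand in for the real iPaaS/ETL tooling named in the qualifications:

```python
import sqlite3

# Minimal ETL sketch: extract rows from a raw table, transform them
# (normalize keys, drop invalid rows), and load them into a clean table.
# Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (sku TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)",
                 [(" ab-1 ", 10.0), ("AB-2", -5.0), ("ab-3", 7.5)])
conn.execute("CREATE TABLE clean_sales (sku TEXT, amount REAL)")

# Extract
rows = conn.execute("SELECT sku, amount FROM raw_sales").fetchall()
# Transform: trim and uppercase SKUs, drop negative amounts
cleaned = [(sku.strip().upper(), amt) for sku, amt in rows if amt >= 0]
# Load
conn.executemany("INSERT INTO clean_sales VALUES (?, ?)", cleaned)

print(conn.execute("SELECT sku, amount FROM clean_sales ORDER BY sku").fetchall())
# → [('AB-1', 10.0), ('AB-3', 7.5)]
```

A production pipeline would add the scheduling, monitoring, and failure alerting described in the responsibilities, but the extract-transform-load skeleton is the same.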
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
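Among the qualifications above is building Retrieval-Augmented Generation (RAG) pipelines over vector databases. The core retrieval step can be illustrated with a toy example: rank documents by cosine similarity between their embedding vectors and a query vector. The hand-made vectors and document names below are purely illustrative; a real pipeline would use an embedding model and a vector store such as Pinecone or Vertex AI Vector Search, as named in the posting:

```python
import math

# Toy retrieval step of a RAG pipeline: rank documents by cosine
# similarity to a query vector. The vectors are hand-made stand-ins
# for real embeddings; document names are invented.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

docs = {
    "returns_policy": [0.9, 0.1, 0.0],
    "store_hours":    [0.1, 0.8, 0.2],
    "pricing_faq":    [0.7, 0.3, 0.1],
}

query = [1.0, 0.0, 0.0]
ranked = sorted(docs, key=lambda d: cosine(docs[d], query), reverse=True)
print(ranked[0])  # → returns_policy
```

The retrieved documents would then be injected into the model's prompt; curating and indexing the knowledge base, as the posting describes, is what makes this retrieval step accurate.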
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Applications Development: Experience developing GenAI applications is a strong plus.
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges, ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Azure Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Azure Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
Job Overview:
Armas Pharmaceuticals is seeking a highly analytical and detail-oriented Business Data Analyst to join our growing team. This entry-level position is ideal for a recent graduate who is passionate about data analysis, business intelligence, and process improvement.
The analyst will play a key role in supporting the Finance and Operations teams by analyzing business data, preparing reporting for senior leadership, and helping improve internal data systems. The role requires strong Excel proficiency, analytical thinking, and the ability to translate raw data into actionable insights.
This position offers a unique opportunity to work cross-functionally while contributing to strategic decision-making and operational efficiency.
Key Responsibilities:
Data Analysis & Reporting
• Analyze business and sales data to identify trends, patterns, and performance insights.
• Prepare structured data sets and reports for senior management and leadership review.
• Support ongoing business reporting initiatives through data analysis and visualization.
Financial & Operational Support
• Assist the finance team with data preparation, analysis, and reporting processes.
• Support financial modeling and reporting through structured data organization.
• Ensure accuracy and consistency of financial and operational datasets.
Data Systems & Automation
• Assist in developing and improving automated reporting processes using Excel and related tools.
• Contribute to the expansion and maintenance of the company’s internal data warehouse.
• Identify opportunities to streamline workflows through automation and improved data structures.
Partner & Business Intelligence Support
• Prepare partner data and operational metrics to support leadership reporting.
• Work with cross-functional teams to compile and analyze key business metrics.
• Help develop scalable data processes that improve business visibility.
Innovation & Technology
• Explore opportunities to leverage AI tools and integrations to enhance business intelligence, automation, and reporting capabilities.
• Assist in identifying technologies that improve data accessibility and decision-making.
Qualifications:
Education
• Bachelor’s degree required in one of the following or related fields:
· Finance
· Economics
· Business Analytics
· Data Science
· Mathematics
· Information Systems
· Business Administration
Required Skills
• Advanced Microsoft Excel skills (pivot tables, advanced formulas, data analysis tools).
• Strong analytical and critical thinking ability.
• Ability to organize and interpret large datasets.
• Excellent attention to detail and data accuracy.
• Strong written and verbal communication skills.
• Ability to work independently while collaborating across departments.
Preferred Skills (Bonus)
• Experience with data visualization tools (Power BI, Tableau, etc.)
• Exposure to SQL or database systems
• Familiarity with data warehouse concepts
• Experience integrating or applying AI tools to business processes
Compensation & Benefits:
• Salary: $50,000 – $60,000 annually
• Performance-based bonus opportunity
• Full-time employee benefits including:
• Health insurance (after eligibility period)
• 401(k) retirement plan (after eligibility period)
• Paid Time Off (PTO)
• Participation in company incentive programs
About Armas Pharmaceuticals, Inc.
Armas Pharmaceuticals is a generic pharmaceutical company delivering an ever-growing portfolio of high-quality products that provide convenience, affordability, and consistency. Through the development of strong partnerships and industry relationships, Armas maintains this pipeline of products with customer and patient satisfaction in mind. Headquartered in Freehold, New Jersey, Armas Pharmaceuticals is always open to new partnership opportunities and ways to offer our customers the best products at affordable prices. For more information, please visit .
DEPLOY has been retained to find a Reporting & Data Architect Lead who combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform—designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.
You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.
At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.
Key Responsibilities
Enterprise Reporting (Hands-On Development)
- Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
- Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
- Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
- Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
- Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (Responsible, with the Director of Technology as Accountable).
- Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
- Support reporting scalability as our client grows (new factories, new business units, new product lines).
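Row-level security, mentioned in the responsibilities above, is implemented in Power BI through DAX filters attached to roles. As a language-neutral illustration of the underlying idea, here is a Python sketch in which each user sees only rows matching their assigned region; the user names, regions, and data are invented:

```python
# Illustration of the row-level-security idea: each user sees only the
# rows matching their assigned region. In Power BI this is expressed as
# a DAX filter on a security role; here it is plain Python for clarity.
# All names and data are hypothetical.
USER_REGIONS = {"alice@example.com": "EMEA", "bob@example.com": "APAC"}

SALES = [
    {"region": "EMEA", "amount": 100},
    {"region": "APAC", "amount": 250},
    {"region": "EMEA", "amount": 75},
]

def visible_rows(user):
    """Return only the rows the given user is allowed to see."""
    region = USER_REGIONS.get(user)
    return [row for row in SALES if row["region"] == region]

print(visible_rows("alice@example.com"))
# → [{'region': 'EMEA', 'amount': 100}, {'region': 'EMEA', 'amount': 75}]
```

The equivalent Power BI role would carry a filter such as a region comparison against the signed-in user's mapping table; the governance work described above is largely about keeping that mapping and those filters consistent across workspaces.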
Enterprise Reporting Standards & Governance
- Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
- Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
- Review and approve reporting change requests, data model modifications, and access requests.
- Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
- Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.
Enterprise Data Architecture
- Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
- Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
- Build and govern the centralized semantic model that powers reporting across the company.
- Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
- Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.
Leadership & Collaboration
- Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
- Communicate complex architectural concepts in clear, business-friendly terms.
- Lead cross-functional initiatives that require unified data structures or scalable reporting.
- Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.
Ideal Candidate Profile
- Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
- Track record of establishing and enforcing enterprise reporting standards and governance.
- Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
- Able to operate as both an individual contributor and a strategic leader.
- Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
- Comfortable influencing architectural decisions and guiding technical execution.
- Strong command of foundational tools and languages such as:
- DAX
- Power Query / M
- SQL
- Fabric pipelines / ETL tooling
- Experience with automation and AI-assisted analytics workflows.
Location: Atlanta, Georgia
Full/Part Time: Full-Time
Regular/Temporary: Regular
About Us
Overview
Georgia Tech prides itself on its technological resources, collaborations, high-quality student body, and its commitment to building an outstanding and diverse community of learning, discovery, and creation. We strongly encourage applicants whose values align with our institutional values, as outlined in our Strategic Plan. These values include academic excellence, diversity of thought and experience, inquiry and innovation, collaboration and community, and ethical behavior and stewardship. Georgia Tech has policies to promote a healthy work-life balance and is aware that attracting faculty may require meeting the needs of two careers.
About Georgia Tech
Georgia Tech is a top-ranked public research university situated in the heart of Atlanta, a diverse and vibrant city with numerous economic and cultural strengths. The Institute serves more than 45,000 students through top-ranked undergraduate, graduate, and executive programs in engineering, computing, science, business, design, and liberal arts. Georgia Tech's faculty attracted more than $1.4 billion in research awards this past year in fields ranging from biomedical technology to artificial intelligence, energy, sustainability, semiconductors, neuroscience, and national security. Georgia Tech ranks among the nation's top 20 universities for research and development spending and No. 1 among institutions without a medical school.
Georgia Tech's Mission and Values
Georgia Tech's mission is to develop leaders who advance technology and improve the human condition. The Institute has nine key values that are foundational to everything we do:
1. Students are our top priority.
2. We strive for excellence.
3. We thrive on diversity.
4. We celebrate collaboration.
5. We champion innovation.
6. We safeguard freedom of inquiry and expression.
7. We nurture the wellbeing of our community.
8. We act ethically.
9. We are responsible stewards.
Over the next decade, Georgia Tech will become an example of inclusive innovation, a leading technological research university of unmatched scale, relentlessly committed to serving the public good; breaking new ground in addressing the biggest local, national, and global challenges and opportunities of our time; making technology broadly accessible; and developing exceptional, principled leaders from all backgrounds ready to produce novel ideas and create solutions with real human impact.
Department Information
The Office of Institutional Research and Planning (IRP) at Georgia Tech is a research and analytics service unit dedicated to supporting the campus community. Our team of institutional research and data analytics professionals combines technical and creative skills to inform institutional strategic decision-making, planning, and research across campus. In addition to institutional reporting and compliance, IRP provides data education, support, and resources to all campus units.
Visit our website to learn more about what we do:
Job Summary
Data Analysts analyze data, interpret trends and patterns, and provide insights to support decision-making processes. They develop data models, perform data mining and statistical analysis, and collaborate with stakeholders to optimize data-driven strategies.
Responsibilities
Job Duty 1 -
Collect, analyze, and interpret data from various sources, databases, and systems to extract insights, trends, and patterns that inform business decisions, strategies, and operations.
Job Duty 2 -
Develop and maintain data models, queries, and reports using SQL, Python, R, or data analysis tools to perform data cleansing, transformation, and visualization tasks.
Job Duty 3 -
Identify data quality issues, anomalies, and discrepancies in datasets, conduct data validation, data profiling, and data integrity checks to ensure data accuracy and reliability.
Job Duty 4 -
Create data visualizations, dashboards, and data analytics reports to communicate data findings, trends, and key metrics to stakeholders, management, and decision-makers.
Job Duty 5 -
Conduct ad-hoc data analysis, exploratory data analysis, and statistical analysis to support decision-making processes, performance monitoring, and data-driven insights.
Job Duty 6 -
Perform data mining, predictive analytics, and machine learning tasks to uncover hidden patterns, predict outcomes, and drive data-driven decision-making in organizations.
Job Duty 7 -
Utilize data analytics tools, business intelligence platforms, and statistical software packages to conduct data analysis, data modeling, and data visualization tasks efficiently and accurately.
Job Duty 8 -
Stay current on data analytics trends, tools, and methodologies through training, certifications, and industry publications to enhance data analysis skills and knowledge.
Job Duty 9 -
Collaborate with business users, data scientists, and Information Technology teams to define data requirements, analytics requirements, and data-driven solutions for business problems and opportunities.
Job Duty 10 -
Perform other job-related duties as assigned.
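Job Duty 3 above calls for data profiling and integrity checks. A minimal sketch of a column profiler, the kind of summary used to spot null-heavy or low-cardinality fields, looks like this in Python; the sample records are invented for illustration:

```python
# Simple column profiler: per-field null count and distinct count,
# a common first pass when checking dataset quality. The sample
# records below are hypothetical.
def profile(records):
    """Return {field: {"nulls": n, "distinct": m}} across all records."""
    fields = set().union(*(r.keys() for r in records))
    report = {}
    for field in sorted(fields):
        values = [r.get(field) for r in records]
        report[field] = {
            "nulls": sum(v is None for v in values),
            "distinct": len({v for v in values if v is not None}),
        }
    return report

records = [
    {"term": "Fall", "gpa": 3.2},
    {"term": "Fall", "gpa": None},
    {"term": "Spring", "gpa": 3.9},
]
print(profile(records))
# → {'gpa': {'nulls': 1, 'distinct': 2}, 'term': {'nulls': 0, 'distinct': 2}}
```

Extending the same pass with range checks and cross-field consistency rules turns this profile into the validation and integrity checking the duty describes.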
Responsibilities
The Institutional Research Data Analyst will also be expected to perform various duties specific to institutional research, including but not limited to:
- Responding to intermediate to high difficulty/complexity ad-hoc data and analysis requests
- Adhering to federal, state, and institutional policies, regulations, and requirements related to data security, privacy, and governance
- Completing or supporting the completion of externally-driven compliance and data-related reporting, including:
- Federal, e.g., IPEDS, NSF-HERD, NSF-GSS, etc.
- State, e.g., USG data collections, data requests, etc.
- Higher education organizations, e.g., AAUDE, SREB, NSC, accrediting bodies, etc.
Required Qualifications
Educational Requirements
Bachelor's Degree in related discipline or equivalent combination of education and experience. Advanced certification may be preferred or required (some profiles may require additional education).
Required Experience
Four or more years of relevant experience.
Proposed Salary
Annual Salary Range: $75,751 to $80,000
Knowledge, Skills, & Abilities
SKILLS
o Performs all the standard and technical aspects of the job
o Applies in-depth professional, technical, or industry knowledge to manage significantly complex assignments/projects/programs
o Advanced knowledge of principles and practices of a particular field of specialization and Institute policies, practices, and procedures
USG Core Values
The University System of Georgia is comprised of our 25 institutions of higher education and learning as well as the System Office. Our USG Statement of Core Values are Integrity, Excellence, Accountability, and Respect. These values serve as the foundation for all that we do as an organization, and each USG community member is responsible for demonstrating and upholding these standards. More details on the USG Statement of Core Values and Code of Conduct are available in USG Board Policy 8.2.18.1.2 and can be found on-line at policymanual/section8/C224/#p8.2.18_personnel_conduct.
Additionally, USG supports Freedom of Expression as stated in Board Policy 6.5 Freedom of Expression and Academic Freedom found on-line at policymanual/section6/C2653.
Equal Employment Opportunity
The Georgia Institute of Technology (Georgia Tech) is an Equal Employment Opportunity Employer. The Institute is committed to maintaining a fair and respectful environment for all. To that end, and in accordance with federal and state law, Board of Regents policy, and Institute policy, Georgia Tech provides equal opportunity to all faculty, staff, students, and all other members of the Georgia Tech community, including applicants for admission and/or employment, contractors, volunteers, and participants in institutional programs, activities, or services. Georgia Tech complies with all applicable laws and regulations governing equal opportunity in the workplace and in educational activities.
Equal opportunity and decisions based on merit are fundamental values of the University System of Georgia ("USG") and Georgia Tech. Georgia Tech prohibits discrimination, including discriminatory harassment, on the basis of an individual's race, ethnicity, ancestry, color, religion, sex (including pregnancy), national origin, age, disability, genetics, or veteran status in its programs, activities, employment, and admissions. Further, Georgia Tech prohibits citizenship status, immigration status, and national origin discrimination in hiring, firing, and recruitment, except where such restrictions are required in order to comply with law, regulation, executive order, or Attorney General directive, or where they are required by Federal, State, or local government contract.
Other Information
This is not a supervisory position.
This position does not have any financial responsibilities.
This position will not be required to drive.
This role is not considered a position of trust.
This position does not require a purchasing card (P-Card).
This position does not require travel.
This position does not require security clearance.
Background Check
Successful candidate must be able to pass a background check. Please visit employment/pre-employment-screening
Excella is a transformative technology firm that helps organizations unlock new possibilities. We believe the key to helping clients challenge the status quo and reach new heights lies in our talented people. That’s why we’re committed to developing talent and providing opportunities for career growth at every stage. Join our collaborative team dedicated to solving complex problems with sustainable solutions while building your future as a leader. At Excella, you’re empowered to make lasting impact, turning today’s challenges into tomorrow’s mission successes.
- Workplace locations look different for everyone. Excellians are a distributed workforce and whether you're working from your home office or a client site, we support a flexible work/life integration regardless of your location.
- We offer top of industry medical, dental, and vision benefits with multiple options to choose from such as an employer-contributed health savings account, infertility coverage, and orthodontia so you can select the plan that works best for you.
- Regardless of what stage of life you’re in, Excella wants to support you. We provide 8 weeks of Parental Leave, discounted pet insurance, and a membership with 3 back-up emergency child or elder care days annually – all available to you on your first day.
- Starting day one, every employee is bonus eligible and receives 15 days of paid vacation, 6 federal holidays, and 4 floating holidays.
- Doing your best work means having the best tools! Excella’s TechEleX program provides you with multiple options to suit your technology needs. Choose between a variety of Mac or PC devices, and to ensure your hardware remains current, at the end of a 3-year period Excella will replace your existing computer with a new model from the program. Plus, we’ll even give you the original device to keep for your personal use!
- With Excella’s Annual Internet Reimbursement benefit, all employees receive an additional $25 per month to help offset the cost of internet access. This initiative reflects our commitment to supporting you in staying connected and productive, no matter where you work.
- We'll invest in your career by providing 3 days of paid professional development every year, including an allowance for registration fees to attend classes, conferences, or obtain professional certifications.
- We encourage mindfulness and overall well-being through employee wellness events, a HeadSpace membership, as well as access to TalkSpace and mental health coverage through our medical plans.
Overview
We are looking for a Data Analyst to join our team and deliver valuable customer experiences. Our analysts are team-oriented, collaborative, and focus on delivering value in everything we do. We use agile methods to analyze, define and document business requirements for software solutions which align with organizational goals and help our clients achieve their desired outcomes. We support our clients in their digital product needs – from vision to roadmap to execution.
Responsibilities
- Develop a thorough understanding of the business context and objectives by, among other things, eliciting and analyzing requirements from all relevant stakeholders (business users, data engineers, data scientists, and similar).
- Operationalize business questions using data.
- Demonstrate understanding of relational databases, data architecture, and data modeling.
- Build complex SQL statements to answer business questions.
- Design conceptual data models, entity relationship diagrams, and business process models.
- Transform and cleanse data using tools such as R, SQL, or Python.
- Interpret, understand, and explain data to clients.
- Apply basic knowledge of dashboard management and creation (drag and drop) to tell stories through regular information reports and ad-hoc requests.
- Build trust and respect, establish relationships, and develop rapport with technical and non-technical team members.
- Document data analysis processes and outcomes.
- Apply industry or subject matter expertise (e.g., finance, employment, hospitality, web analytics).
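Several of the responsibilities above center on answering business questions with SQL. As a minimal, hypothetical sketch (the table, columns, and figures are all invented for illustration), the pattern might look like:

```python
import sqlite3

# Hypothetical example: answer "which product line drives the most revenue
# per customer?" with a single SQL statement over an in-memory database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, product_line TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'hardware', 120.0), (1, 'software', 80.0),
        (2, 'hardware', 200.0), (3, 'software', 300.0);
""")

query = """
    SELECT product_line,
           SUM(amount) AS total_revenue,
           ROUND(SUM(amount) * 1.0 / COUNT(DISTINCT customer_id), 2)
               AS revenue_per_customer
    FROM orders
    GROUP BY product_line
    ORDER BY total_revenue DESC;
"""
for row in conn.execute(query):
    print(row)
```

Most "business question" queries reduce to this shape: an aggregate over a grouping dimension, ordered by the metric of interest.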
Qualifications
- B.A. or B.S. (focus in Computer Information Systems, Business Management, Engineering, or related area of study is a plus)
- Ability to obtain and maintain a Top Secret (TS) clearance and a Public Trust clearance is required.
- 3+ years’ experience in IT or related industry
- Prior consulting or client management experience preferred
- Intermediate/Advanced SQL skills
- Experience working with Data Visualization tools such as Tableau or Power BI
- Preferred experience with cloud analytics products such as Google Cloud, AWS, or Microsoft Azure
- Preferred knowledge of statistics and experience using statistical packages for analyzing datasets
- Strong analytical aptitude and ability to structure complex or undefined business problems
- Strong verbal and written communication skills; able to report and present findings to varying stakeholders
- Hands-on experience using tools to help clients make data-driven decisions
- Knowledge and practice of core Agile values and principles a plus
- Initiative to learn new strategies and trends to continue to educate self analytically, technically, and technologically
- Intermediate experience with Microsoft Excel and PowerPoint
- Understanding of DevOps Research and Assessment (DORA) and the capabilities within the DORA capability catalog is encouraged
Excella is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected veteran status, age, or any other characteristic protected by law. Excella is committed to providing access, equal opportunity, and reasonable accommodation for individuals with disabilities in employment, its services, programs, and activities. To request reasonable accommodation to participate in the job application or interview process, contact or 7
Job Description
At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.
The Boeing Defense, Space & Security (BDS) KC-46 Tanker Program is seeking a Product Lifecycle Management Engineer to perform change integration functions for new and derivative products and provide technical support in Configuration Status Accounting tasks such as As-Built to As-Design reconciliation. This is a BDS position located in Everett, Washington, and reports to the PLM Manager of the KC-46 Configuration and Data Management team.
The KC-46 is a high-visibility, leading-edge Commercial Derivative airplane program to support the United States Air Force and International Customers. As a Product Lifecycle Management Engineer, you will use Program Management Best Practices to oversee the definition and integration of configuration and data management tasks that span multiple engineering functions and airplane level engineering projects and processes as the KC-46 program transitions from Development into Production phases. This position will be 100% onsite in Everett, WA.
Position Responsibilities:
- Collaborates/Leads the development, analysis, management and compliance verification of process and product baselines of complex products
- Defines, plans, coordinates and conducts (or leads) product and subsystem level technical design reviews and audits for new and derivative products.
- Coordinates/leads the integration and control of the configuration of Software and Hardware product elements and analyzes & resolves issues with engineering product structure.
- Develops, integrates and implements (or leads) engineering technical program plans including impacts, risks and incorporation of lessons learned spanning multiple engineering functions.
- Applies knowledge of the interface and integration constraints for complex systems to identify and analyze hardware, software, product and system impacts to effectively define an integrated change proposal.
- Creates, reviews and manages software description and software configuration documents and artifacts; prepares software deliverables that meet contract and CDRL (Contract Data Requirements List) requirements.
- Ensures consistent application of Configuration and Data Management policies, processes and program management best practices, and ensures that process documents are current and accurate.
- Understands and is able to interpret contract requirements and Contractor Data Requirements List (CDRL) content related to Product Life Cycle, including Configuration Management, Data Management and Configuration Status Accounting (CSA).
This position requires the ability to obtain a US Security Clearance for which the US Government requires US Citizenship. An interim and/or final U.S. Secret Clearance Post-Start is required.
Basic Qualifications (Required Skills/Experience):
- Level 3: Bachelor's degree and typically 5 or more years' experience in an engineering classification or a Master's degree with typically 3 or more years' experience in an engineering classification
- Level 4: Bachelor's degree and typically 9 or more years' experience in an engineering classification or a Master's degree with typically 7 or more years' experience in an engineering classification
- Written and verbal communication skills with strong technical content.
- Experience and knowledge of responsibilities and tasks performed by various Engineering disciplines.
- Experience with Microsoft Office applications, especially strong Excel skills as a data processing tool.
- Skills and ability to: collect, organize, synthesize, and analyze data; summarize findings; develop conclusions and recommendations from appropriate data sources.
- Knowledge of drawing/data systems (e.g., "used on" drawings, part relationships, product data management) and configuration management principles and processes (e.g., part number control, revision level, naming conventions, product identification numbering systems).
- Able to understand and interpret contract requirements, especially Contractor Data Requirements List (CDRL) content related to Configuration Status Accounting (CSA).
Preferred Qualifications:
- Knowledge of responsibilities and tasks performed by various Engineering departments/disciplines (e.g., design, test, software, technology, avionics). Knowledge of the interaction between departments/disciplines and how their products/processes affect one another and impact non-engineering processes (e.g., Operations, Logistics, Business).
- Familiar with relational database language and tools such as SQL Server, MySQL
- Familiar with application development languages and tools such as C-Sharp, HTML, CSS, JavaScript.
Relocation:
This position does not offer relocation.
Drug Free Workplace:
Boeing is a Drug Free Workplace where post offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.
Shift:
This position is for 1st shift. Occasional alternative shifts as needed.
Pay & Benefits:
At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities.
The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work.
The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements.
Pay is based upon candidate experience and qualifications, as well as market and business considerations.
Summary Pay Range:
Level 3: $111,350-$150,650
Level 4: $135,150-$182,850
Applications for this position will be accepted until Mar. 20, 2026
Export Control Requirements:
This is not an Export Control position.
Education
Bachelor's Degree or Equivalent Required
Relocation
Relocation assistance is not a negotiable benefit for this position.
Security Clearance
This position requires the ability to obtain a U.S. Security Clearance for which the U.S. Government requires U.S. Citizenship. An interim and/or final U.S. Confidential Clearance Post-Start is required.
Visa Sponsorship
Employer will not sponsor applicants for employment visa status.
Shift
This position is for 1st shift
Equal Opportunity Employer:
Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.
Position title:
Project Scientist
Salary range:
The UC academic salary scales set the minimum pay determined by rank and step at appointment. See the following table for the current salary scale for this position: . A reasonable estimate for this position is $181,700 - $229,700.
Percent time:
100%
Anticipated start:
Winter/Spring 2026
Position duration:
Initial appointment is for one year with the possibility of renewal based on performance and funding availability.
Application Window
Open date: February 25, 2026
Next review date: Wednesday, Mar 11, 2026 at 11:59pm (Pacific Time)
Apply by this date to ensure full consideration by the committee.
Final date: Friday, Mar 27, 2026 at 11:59pm (Pacific Time)
Applications will continue to be accepted until this date, but those received after the review date will only be considered if the position has not yet been filled.
Position description
The Advanced BioImaging Center (ABC) in the Department of Molecular and Cell Biology at the University of California, Berkeley seeks applications for two Project Scientists at the Assistant, Associate, or Full rank. The selected candidates will be appointed at the rank commensurate with prior experience. The position will report to Professor Gokul Upadhyayula, with Professor Eric Betzig serving as an additional academic mentor. The project scientist will make significant and creative contributions in the area of machine learning & data analytics.
The Advanced BioImaging Center (ABC) at UC Berkeley aspires to be a world-leading multidisciplinary imaging center that drives important biological discoveries through critical new advances in all aspects of imaging technology and that drives the dissemination of that technology through a multi-pronged education strategy to scientists around the world. ABC was intentionally designed to maximize scientific productivity and impact by adopting groundbreaking imaging technologies such as the next-generation adaptive optical multifunctional microscope, incorporating the high-level technical expertise of instrumentation scientists, applied mathematicians, and computational scientists, and building worldwide collaborations aimed at tackling the challenges posed by terabyte- and petabyte-scale imaging data processing, visualization, and dissemination. Members of the ABC have access to leading-edge imaging and computing hardware, as well as exposure to collaborators from a range of diverse disciplines, including the fields of Artificial Intelligence, Data Science, Mathematics, and more.
The Assistant/Associate/Full Project Scientists will be an integral part of a visionary scientific team driving cutting-edge biological discoveries through immediate applications of critical advances in imaging technologies. These positions will work with a dedicated team to develop data analytics software for terabyte- to petabyte-scale imaging projects. The incumbents will develop and refine machine learning applications, manage projects, and provide regular progress reports to PIs and collaborators.
Successful candidates will be an integral part of the expert team working together with computational scientists and biologists in experimental design to tackle complex biological questions in a quantitative manner. The work will primarily be conducted at the facility in Barker Hall. Occasional travel may be required.
Key Responsibilities
*Make significant and creative contributions to the development of new imaging and data processing tools for datasets generated on multicellular tissues, organoids, and transparent embryos.
*Design, build, and maintain new software packages for efficient data processing.
*Advise on applications of these tools for biological imaging; collaborate with Postdocs and graduate students on specific projects to test, learn and implement for general and specific use cases.
*General organization and management of software documentation.
*Bring cross disciplinary expertise to solve problems at the intersection between life science, computer vision, and state-of-the-art AI methods.
*Work with petabyte-scale light sheet datasets that are typically 4D or 5D (x,y,z,t,chemistry). Identify and implement scalable solutions to scientific questions on large-scale data sets, especially using performant algorithms.
*Develop machine learning approaches and computer vision tools to help pre-process datasets and annotations to generate ground-truth benchmarks.
*Contribute to dissemination via open-source code repositories, demonstrations, publications, and presentations.
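The large-scale processing responsibilities above usually start from one idea: a petabyte-scale volume is never loaded whole, but tiled into chunks that fit in memory and processed independently. A dependency-free sketch of that tiling step (the shapes and chunk sizes are arbitrary assumptions, not values from this posting):

```python
from itertools import product

def chunk_slices(shape, chunk):
    """Yield tuples of slice objects that tile an N-D array of the given
    shape into blocks of at most `chunk` along each axis."""
    ranges = [range(0, s, c) for s, c in zip(shape, chunk)]
    for starts in product(*ranges):
        yield tuple(slice(st, min(st + c, s))
                    for st, c, s in zip(starts, chunk, shape))

# A hypothetical 4D light-sheet volume: (z, y, x, t)
shape = (64, 512, 512, 10)
chunk = (32, 256, 256, 10)
tiles = list(chunk_slices(shape, chunk))
print(len(tiles))  # 2 * 2 * 2 * 1 = 8 tiles
```

Each slice tuple can then be handed to a separate worker (e.g., a SLURM array job), which is the usual pattern for scaling to datasets far larger than any single node's memory.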
These positions will be eligible for full benefits.
Lab:
Contract: ar-contract-2022/
Qualifications
Basic qualifications (required at time of application)
*PhD (or equivalent international degree)
Additional qualifications (required at time of start)
*Minimum of four years of postdoctoral research experience
*For consideration for the Associate Project Scientist rank: a minimum of 8 years of post PhD research experience
*For consideration for the full Project Scientist rank: a minimum of 14 years of post PhD research experience
Preferred qualifications
*PhD or equivalent international degree in Computational Data Science, Computer Science, Bioinformatics, or a related field
*Demonstrated record of productivity and publications and/or scholarly contributions
*Strong biological background and understanding of molecular biology
*Demonstrated understanding of optical microscopy, including light sheet microscopy, adaptive optics, and modern scientific cameras
*Demonstrated ability to work in a research team, manage active collaborations with other academic groups
*Demonstrated experience handling and processing large scale imaging datasets (>100TB to petabyte scale and beyond)
*Expertise in programming in C++, LabVIEW, MATLAB, or Python
*Expertise in databases, data infrastructure, data governance
*Expertise in high performance computing using SLURM or LSF
*Experience with PyTorch, JAX, or Tensorflow
*Experience with NVIDIA CUDA and related OpenMP programming
*Experience with cloud services (AWS, GCP, Azure, etc)
*Experience with state-of-the-art AI/ML architectures (vision transformers, diffusion models, etc.)
*Experience mentoring undergraduate/graduate students, and/or technicians.
*Experience with professional speaking engagements
*Ability to effectively communicate, participate in efficient and open collaboration, and engage with a diverse group of researchers
*The ideal candidate will be innovative and able to synergize various ideas and approaches, while exercising sound judgment to evaluate and take acceptable risks
Application Requirements
Document requirements
Curriculum Vitae - Your most recently updated C.V.
Cover Letter
Statement of Research - Provide a summary of your major research accomplishments in approximately 250 words. Additionally, please include a brief statement highlighting your experience that is directly relevant to the key responsibilities of this position
Project Portfolio - Summary portfolio of data and/or AI projects executed, as demonstrated by publications or github contributions
Reference requirements
- 3 required (contact information only)
Apply link:
JPF05256
Help contact:
About UC Berkeley
UC Berkeley is committed to diversity, equity, inclusion, and belonging in our public mission of research, teaching, and service, consistent with UC Regents Policy 4400 and University of California Academic Personnel policy (APM 210 1-d). These values are embedded in our Principles of Community, which reflect our passion for critical inquiry, debate, discovery and innovation, and our deep commitment to contributing to a better world. Every member of the UC Berkeley community has a role in sustaining a safe, caring and humane environment in which these values can thrive.
The University of California, Berkeley is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, or protected veteran status.
For more information, please refer to the University of California's Affirmative Action and Nondiscrimination in Employment Policy and the University of California's Anti-Discrimination Policy.
In searches when letters of reference are required all letters will be treated as confidential per University of California policy and California state law. Please refer potential referees, including when letters are provided via a third party (i.e., dossier service or career center), to the UC Berkeley statement of confidentiality prior to submitting their letter.
As a University employee, you will be required to comply with all applicable University policies and/or collective bargaining agreements, as may be amended from time to time. Federal, state, or local government directives may impose additional requirements.
Unless stated otherwise, unambiguously, in the position description, this position does not include sponsorship of a new consular H-1B visa petition that would require payment of the $100,000 supplemental fee.
As a condition of employment, the finalist will be required to disclose if they are subject to any final administrative or judicial decisions within the last seven years determining that they committed any misconduct.
- "Misconduct" means any violation of the policies or laws governing conduct at the applicant's previous place of employment, including, but not limited to, violations of policies or laws prohibiting sexual harassment, sexual assault, or other forms of harassment or discrimination, as defined by the employer.
- UC Sexual Violence and Sexual Harassment Policy
- UC Anti-Discrimination Policy
- APM - 035: Affirmative Action and Nondiscrimination in Employment
Job location
Berkeley, CA
Data Scientist Everfit | Hybrid, San Francisco Bay Area
About Everfit
Everfit is a fitness technology company building an AI-powered coaching platform that serves 280,000+ coaches and millions of training clients globally. We're transforming how fitness professionals deliver personalized training and nutrition guidance to their clients through intelligent automation and data-driven insights.
About the Role
We're looking for a senior data scientist who is passionate about fitness and energized by turning data into actionable insights that help coaches and their clients succeed. You'll play a critical role in understanding user behavior, product performance, and business metrics to inform strategic decisions as we scale our platform.
What You'll Do
Product Analytics & User Insights
- Define and track key product metrics (activation, engagement, retention, churn) to measure product health and success.
- Conduct cohort, funnel, and retention analyses to uncover behavioral insights and inform feature prioritization.
- Identify opportunities to improve onboarding, engagement, and coach–client interactions.
Experimentation & A/B Testing
- Own the experimentation framework and guide teams through hypothesis design, sample sizing, execution, and interpretation.
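Guiding teams through sample sizing, as mentioned above, usually relies on the standard normal-approximation formula for comparing two conversion rates. A hedged, dependency-free sketch (the default z quantiles assume the conventional two-sided alpha = 0.05 and 80% power; the rates are invented):

```python
from math import sqrt, ceil

def sample_size_per_arm(p1, p2, z_alpha=1.95996, z_power=0.84162):
    """Approximate per-arm sample size for a two-sided two-proportion
    z-test (normal approximation). The default z quantiles correspond
    to alpha = 0.05 and power = 0.8."""
    p_bar = (p1 + p2) / 2
    num = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
           + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(num / (p1 - p2) ** 2)

# How many users per arm to detect a lift from 10% to 12% conversion?
print(sample_size_per_arm(0.10, 0.12))
```

The main intuition to convey to teams: required sample size grows with the inverse square of the effect size, so halving the detectable lift quadruples the traffic needed.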
Strategic Impact & Roadmapping
- Collaborate with leadership to translate data insights into roadmap priorities and measurable business outcomes.
- Build predictive models and scenario analyses to support forecasting, pricing, and product investment decisions.
- Establish best practices in data instrumentation, dashboarding, and self-serve analytics across teams.
Technical Foundations
- Partner with data engineering to improve pipelines and instrumentation.
- Leverage tools such as SQL, Python/R, data visualization platforms, and experimentation platforms.
Marketing Analytics & Optimization
- Analyze customer acquisition funnels and marketing performance to identify high-impact opportunities for growth and conversion.
- Partner with marketing and growth teams to design and evaluate campaign experiments
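The cohort and retention analyses described above reduce to a small pattern: group users by signup cohort, then measure what share of each cohort is still active N periods later. A toy sketch over an invented activity log (user IDs, cohorts, and events are all hypothetical):

```python
from collections import defaultdict

# Hypothetical activity log: (user_id, signup_cohort, weeks_since_signup).
events = [
    ("u1", "W01", 0), ("u1", "W01", 1), ("u1", "W01", 2),
    ("u2", "W01", 0), ("u2", "W01", 1),
    ("u3", "W02", 0), ("u3", "W02", 2),
]

# Users active per (cohort, week offset).
active = defaultdict(set)
for user, cohort, week in events:
    active[(cohort, week)].add(user)

def retention(cohort, week):
    """Share of the cohort's week-0 users who were active in the given week."""
    base = active[(cohort, 0)]
    return len(active[(cohort, week)] & base) / len(base)

print(retention("W01", 1))  # both W01 users returned in week 1
```

In practice the same computation is expressed in SQL or a tool like Amplitude, but the cohort/offset grouping is identical.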
What We're Looking For
- 4-6 years of experience in a data analyst or analytics role, preferably at a growth-stage tech company
- Strong proficiency in SQL and experience setting up data pipelines, transforming data, and analyzing large datasets
- Deep experience creating dashboards and delivering analysis with product analytics and data visualization tools (Amplitude, Looker, Tableau, Mode, or similar)
- Understanding of SaaS metrics and cohort analysis
- Experience with translating numerical findings into clear insights for non-technical team members
- Genuine passion for fitness, health, or wellness (we build for coaches so you need to understand their world)
Bonus Points:
- Experience with Python or R for statistical analysis
- Experience in a PLG (Product-Led Growth) environment
- Experience working at a company during a hypergrowth phase
- Background in fitness, health, wellness, or coaching industries
You'll thrive here if you:
- Are naturally curious and love asking "why" until you find the answer
- Are excited by fast-paced, high-growth environments with a passion for building out systems for scaling
- Enjoy collaborating with global teams and making complex topics understandable
- Are comfortable with ambiguity and can structure your own work
- Care deeply about the impact your insights have on real coaches and their clients
Why Join Everfit
- Establish the foundations for Fitness Intelligence and help shape the future of coaching for millions around the world
- Work with autonomy and ownership on high-impact projects
- Join a collaborative, global team with experience from leading tech and fitness companies
- Enjoy competitive salary, equity, and performance bonuses
- Build something meaningful that helps people live better, healthier lives
Everfit is an equal opportunity employer committed to building a diverse and inclusive team. We make employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability, or any other protected status.
Ready to dive into data in the fitness intelligence space? We'd love to hear from you.
Role - Lead Data Analyst
Location : Denver, CO [Local Only] In-person client interview
Job Summary
This role is responsible for extracting meaningful information and providing the business with actionable recommendations to drive outcomes. Responsible for leveraging existing data sources and creating new analysis methods.
Major Duties And Responsibilities
- Actively and consistently supports all efforts to simplify and enhance the customer experience.
- Lead client teams to define clear business requirements for data analysis projects.
- Provide metrics definition, data visualizations, and ETL requirements.
- Extract, clean and engineer data to be ready for analysis.
- Interpret data, formulate hypotheses and develop an analytical approach to meet business requirements
- Create customer-readable reports using advanced visualization tools such as Tableau, PowerBI, Excel, etc.
- Work to obtain and ingest new reference data sources required to deliver on business need.
- Communicate results and make recommendations using data visualization and presentations.
- Create analyses and dashboards that are usable, elegant and industry leading.
Required Qualifications
- Ability to read, write, speak and understand English
- Demonstrated in-depth ability to analyze, interpret and present data
- Demonstrated in-depth ability to make decisions and solve problems while working under pressure
- Demonstrated in-depth ability to prioritize and organize effectively
- Demonstrated mastery of advanced analytics processes and reporting design principles
- Demonstrated mastery in SQL, Python, or R
- Demonstrated in-depth proficiency of design and implementation practices within data visualization tools
- Effective communication skills, verbal and written, for internal and external customers
- Ability to communicate complex technical concepts to all levels of an organization to aid in decision-making
Required Education
- Bachelor's degree in Computer Science, Engineering or related field; or equivalent experience
Required Related Work Experience and Number of Years
- 7+ years’ experience working within a data platform/data analysis environment
- 7+ years’ experience in a customer facing products/services environment
- Technical lead with experience across multiple industries and deeper expertise than Data Insight Analysts
About Us
Perform Properties is a Blackstone Real Estate portfolio company focused on high-performing retail and office properties with People-Appeal - vibrant spaces where people actively choose to work, shop, and gather. With expertise in transactions, development, leasing, and management, the company oversees over 33 million square feet of retail and office properties across the U.S. Learn more: .
Role Summary
Our VP, Data & Analytics unlocks data-driven growth at the speed of natural language through AI-enabled execution across Perform Properties. An innovative architect with deep business literacy, the VP, Data & Analytics will lead our efforts to put data at the center of our Technology capabilities with a modern, performant and AI-ready data & analytics platform. This critical capability will serve a wide range of business functions, enabling Investments, Portfolio, Operations and Finance people to put AI, BI, Analytics and other emerging technologies to work for them every day – not just talk about the potential & possibilities.
This role reports to the Chief Technology Officer and is based in the office, 5 days a week.
Essential Job Functions
- Drive Data Architecture & Engineering excellence that actively reduces our Coordination Tax
- Build Data Modelling & Analytics capabilities to reduce our Time-to-Productivity
- Champion Artificial and Business Intelligence (AI / BI) capabilities through compelling next generation interactions (Visualization, Natural Language & Agentic) that reduce our Time-to-Insight
- Cultivate Data governance & stakeholder engagement that creates real shared ownership of our platform
- Model the successful use of AI as a capabilities & resource extension, not just a gimmick
- Develop individuals & teams of technologists in the Data & Analytics space as their leader
Qualifications and Technical Competencies
- 10+ years leading Data Science, Data Engineering, Analytics and/or AI / ML-focused teams
- 5-7 years managing agile projects (Scrum, Kanban, SAFe)
- 3-5 years managing people (direct reports, manager of managers)
- Demonstrable success working with modern data platforms (Databricks, Snowflake, BigQuery, Redshift, Synapse)
- Demonstrable success delivering AI / ML initiatives (Natural Language Processing, Predictive Modeling, Statistical Modeling)
- Advanced proficiency in common data engineering tools (R, Python, DBT, SQL, Azure Data Factory)
- Advanced proficiency in common visualization tools (Tableau, Power BI)
- Bachelor’s Degree in Computer Science, Mathematics or relevant tertiary education
Benefits & Compensation
Benefits: The Company provides a variety of benefits to employees, including health insurance coverage, retirement savings plan, paid holidays and paid time off (PTO).
Base Salary Range: $225,000-$265,000. This represents the presently anticipated low and high end of the Company’s base salary range for this position. Actual base salary may vary based on various factors, including but not limited to location and experience.
The additional total direct compensation and benefits described above are subject to the terms and conditions of any governing plans, policies, practices, agreements, or other materials or documents as in effect from time to time, including but not limited to terms and conditions regarding eligibility.
Closing
EEO Statement
Our company is proud to be an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Our employment decisions are based on individual qualifications, job requirements and business needs without regard to race, color, marital status, sex, sexual orientation, gender identity and/or expression, age, religion, disability, citizenship status, national origin, pregnancy, veteran status, or any other legally protected characteristic. We are committed to providing reasonable accommodations. If you need an accommodation to complete the application process, please email
#LI-Onsite
The Operations Data Analyst will play a crucial role in applying data analysis and reporting to support operational performance, production scheduling, and process improvements. This position will work closely with operations teams to assist in scheduling tasks as needed, while also owning analytics processes, reporting, and data-driven insights that improve productivity, quality, and throughput.
***This is a 100% on-site, full-time position in Madison, IN. Hybrid or remote work is NOT available.
Qualifications
Technical Skills
- Strong SQL skills: ability to write complex queries, join large datasets, optimize performance, and produce reliable analytical outputs.
- Advanced Excel expertise: pivot tables, VLOOKUP/XLOOKUP, Power Query, macros/VBA a strong plus.
- Experience with data visualization tools (Power BI, Tableau, or similar) preferred.
- Comfortable working with large datasets and generating meaningful insights.
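As a rough illustration of the kind of analytical query this role describes (joining datasets and aggregating them into an operational metric), here is a small sketch using Python's built-in sqlite3 module. The schema, table names, and defect-rate metric are hypothetical, chosen only to show the pattern of a multi-table join with grouped aggregation:

```python
import sqlite3

# Hypothetical schema: production orders joined to quality results,
# rolled up into a per-line throughput and defect-rate summary.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders  (order_id INTEGER PRIMARY KEY, line TEXT, qty INTEGER);
    CREATE TABLE defects (order_id INTEGER, defect_count INTEGER);
    INSERT INTO orders  VALUES (1, 'A', 100), (2, 'A', 150), (3, 'B', 200);
    INSERT INTO defects VALUES (1, 2), (2, 3), (3, 10);
""")

cur.execute("""
    SELECT o.line,
           SUM(o.qty)          AS total_units,
           SUM(d.defect_count) AS total_defects,
           ROUND(1.0 * SUM(d.defect_count) / SUM(o.qty), 4) AS defect_rate
    FROM orders o
    LEFT JOIN defects d ON d.order_id = o.order_id
    GROUP BY o.line
    ORDER BY defect_rate DESC
""")
rows = cur.fetchall()
for row in rows:
    print(row)
# Line B has the higher defect rate (10/200 = 0.05) and sorts first.
```

The same shape of query scales to the "large datasets" case in the bullet above; the LEFT JOIN keeps production lines visible even when no quality records exist yet.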
Production Experience
- Understanding of production planning, operations workflows, and scheduling concepts — ideally in a manufacturing/industrial environment.
- Prior work in supporting production operations with analytical tools or capacity planning.
Education
- Bachelor’s Degree preferred — Analytics, Industrial Engineering, Supply Chain, Data Science, Business Analytics, Mathematics, or related field.
- Equivalent experience with strong technical skills considered.
Preferred Qualifications
- Experience with manufacturing ERP/MRP systems.
- SQL Server, MySQL, PostgreSQL, or similar database experience.
- Familiarity with scheduling tools or production modules within ERP systems.
- Knowledge of Lean Manufacturing or Continuous Improvement methodologies.
***FOR IMMEDIATE CONSIDERATION, PLEASE SEND A COPY OF YOUR UPDATED RESUME.