Optum Senior Data Engineer Salary Jobs in USA
11,357 positions found
Sr. Data Engineer (Hybrid)
Chicago, IL
The American Medical Association (AMA) is the nation's largest professional association of physicians and a non-profit organization. We are a unifying voice and powerful ally for America's physicians, the patients they care for, and the promise of a healthier nation. To be part of the AMA is to be part of our mission to promote the art and science of medicine and the betterment of public health.
At AMA, our mission to improve the health of the nation starts with our people. We foster an inclusive, people-first culture where every employee is empowered to perform at their best. Together, we advance meaningful change in health care and the communities we serve.
We encourage and support professional development for our employees, and we are dedicated to social responsibility. We invite you to learn more about us and we look forward to getting to know you.
We have an opportunity at our corporate offices in Chicago for a Sr. Data Engineer (Hybrid) on our Information Technology team. This is a hybrid position reporting into our Chicago, IL office, requiring 3 days a week in the office.
As a Sr. Data Engineer, you will play a key role in implementing and maintaining AMA's enterprise data platform to support analytics, interoperability, and responsible AI adoption. The role partners closely with platform engineering, data governance, data science, IT security, and business stakeholders to deliver high-quality, reliable, and secure data products. It contributes to AMA's modern lakehouse architecture, optimizing data operations and embedding governance and quality standards into engineering workflows, and serves as a senior technical contributor within the team, providing mentorship to junior engineers and implementing engineering best practices within the data platform function, in alignment with architectural direction set by leadership.
RESPONSIBILITIES:
Data Engineering & AI Enablement
- Build and maintain scalable data pipelines and ETL/ELT workflows supporting analytics, operational reporting, and AI/ML use cases.
- Implement best-practice patterns for ingestion, transformation, modeling, and orchestration within a modern lakehouse environment (e.g., Databricks, Delta Lake, Azure Data Lake).
- Develop high-performance data models and curated datasets with strong attention to quality, usability, and interoperability; create reusable engineering components and automation.
- Collaborate with the Architecture Team, the Data Platform Lead, and federated IT teams to optimize storage, compute, and architectural patterns for performance and cost-efficiency.
- Build model-ready datasets and feature pipelines to support AI/ML use cases; serve as a technical coordination point supporting business units' AI-related infrastructure needs.
- Collaborate with data scientists and the AI Working Group to operationalize models responsibly and maintain ongoing monitoring signals.
Governance, Quality & Compliance
- Embed data governance, metadata standards, lineage tracking, and quality controls directly into engineering workflows, ensuring sound technical implementation and alignment.
- Work with the Data Governance Lead and business stakeholders to operationalize stewardship, classification, validation, retention, and access standards.
- Implement privacy-by-design and security-by-design principles, ensuring compliance with internal policies and regulatory obligations.
- Maintain documentation for pipelines, datasets, and transformations to support transparency and audit requirements.
Platform Reliability, Observability & Optimization
- Monitor and troubleshoot pipeline failures, performance bottlenecks, data anomalies, and platform-level issues.
- Implement observability tooling, alerts, logging, and dashboards to ensure end-to-end reliability.
- Support cost governance by optimizing compute resources, refining job schedules, and advising on efficient architecture.
- Collaborate with the Data Platform Lead on scaling, configuration management, CI/CD pipelines, and environment management.
- Collaborate with business units to understand data needs, translate them into engineering requirements, and deliver fit-for-purpose data solutions; share and apply best practices and emerging technologies within assigned initiatives.
- Work with IT Security and Legal/Compliance to ensure the platform and datasets meet risk and regulatory standards.
Staff Management
- Lead, mentor, and provide management oversight for staff.
- Set objectives, evaluate employee performance, and foster a collaborative team environment.
- Develop staff knowledge and skills to support career development.
May include other responsibilities as assigned
REQUIREMENTS:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field preferred; at minimum, an HS diploma (or equivalent) plus equivalent work experience is required.
- 5+ years of experience in data engineering within cloud environments
- Experience in people management preferred.
- Demonstrated hands-on experience with modern data platforms (Databricks preferred).
- Proficiency in Python, SQL, and data transformation frameworks.
- Experience designing and operationalizing ETL/ELT pipelines, orchestration workflows (Airflow, Databricks Workflows), and CI/CD processes.
- Solid understanding of data modeling, structured/unstructured data patterns, and schema design.
- Experience implementing governance and quality controls: metadata, lineage, validation, stewardship workflows.
- Working knowledge of cloud architecture, IAM, networking, and security best practices.
- Demonstrated ability to collaborate across technical and business teams.
- Exposure to AI/ML engineering concepts, feature stores, model monitoring, or MLOps patterns.
- Experience with infrastructure-as-code (Terraform, CloudFormation) or DevOps tooling.
The American Medical Association is located at 330 N. Wabash Avenue, Chicago, IL 60611 and is convenient to all public transportation in Chicago.
This role is an exempt position, and the salary range for this position is $115,523.42-$150,972.44. This is the lowest to highest salary we believe we would pay for this role at the time of this posting. An employee's pay within the salary range will be determined by a variety of factors including but not limited to business consideration and geographical location, as well as candidate qualifications, such as skills, education, and experience. Employees are also eligible to participate in an incentive plan. To learn more about the American Medical Association's benefits offerings, please click here.
We are an equal opportunity employer, committed to diversity in our workforce. All qualified applicants will receive consideration for employment. As an EOE/AA employer, the American Medical Association will not discriminate in its employment practices due to an applicant's race, color, religion, sex, age, national origin, sexual orientation, gender identity and veteran or disability status.
THE AMA IS COMMITTED TO IMPROVING THE HEALTH OF THE NATION
Remote working/work at home options are available for this role.
OZ – Databricks Architect / Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Application Development: Experience building GenAI applications is a strong plus.
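The Medallion Architecture named above is usually described as progressive refinement through bronze (raw), silver (validated), and gold (business-level) layers. As a rough illustration only, here is a toy version in plain Python; a real Databricks pipeline would use PySpark DataFrames and Delta tables, and all data and function names here are invented for the sketch:

```python
# Toy Medallion-style refinement: bronze (raw) -> silver (clean) -> gold (aggregate).
# Plain-Python stand-ins for Delta tables; illustrative only.
from collections import defaultdict

# Bronze: events ingested as-is, duplicates and malformed rows included.
bronze = [
    {"order_id": "1", "amount": "19.99", "region": "US"},
    {"order_id": "1", "amount": "19.99", "region": "US"},  # duplicate
    {"order_id": "2", "amount": "bad",   "region": "EU"},  # malformed amount
    {"order_id": "3", "amount": "5.00",  "region": "US"},
]

def to_silver(rows):
    """Silver: deduplicate on order_id and enforce types, dropping invalid rows."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine these rows
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "amount": amount, "region": r["region"]})
    return out

def to_gold(rows):
    """Gold: business-level aggregate (revenue per region)."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
```

The point of the pattern is that each layer is reproducible from the one below it, so quality rules live in one place rather than being scattered across consumers.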
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges while ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Microsoft Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
About tvScientific
tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one, efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.
As a Senior Data Engineer at tvScientific, you will be a key player in implementing the robust data infrastructure to power our data-heavy company. You will collaborate with our cross-functional teams to evolve our core data pipelines, design for efficiency as we scale, and store data in optimal engines and formats. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.
What you'll do:
- Implement robust data infrastructure in AWS, using Spark with Scala
- Evolve our core data pipelines to efficiently scale for our massive growth
- Store data in optimal engines and formats
- Collaborate with our cross-functional teams to design data solutions that meet business needs
- Build out fault-tolerant batch and streaming pipelines
- Leverage and optimize AWS resources while designing for scale
- Collaborate closely with our Data Science and Product teams
How we'll define success:
- Successful implementation of scalable and efficient data infrastructure
- Timely delivery and optimization of data assets and APIs
- High attention to detail in implementation of automated data quality checks
- Effective collaboration with cross-functional teams
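The success criteria above call out automated data-quality checks. As a hedged sketch of what such checks look like in code (check names, thresholds, and sample rows are invented for illustration, not tvScientific's actual rules):

```python
# Toy automated data-quality checks of the kind a pipeline step would run
# before publishing a batch. All names and sample data are illustrative.

def check_not_null(rows, column):
    """Fail if any row is missing a value for `column`."""
    bad = [i for i, r in enumerate(rows) if r.get(column) in (None, "")]
    return {"check": f"not_null:{column}", "passed": not bad, "bad_rows": bad}

def check_unique(rows, column):
    """Fail if `column` contains duplicate values."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        v = r.get(column)
        if v in seen:
            dupes.append(i)
        seen.add(v)
    return {"check": f"unique:{column}", "passed": not dupes, "bad_rows": dupes}

def run_checks(rows, checks):
    """Run all checks; a pipeline would fail the batch if any check fails."""
    results = [c(rows) for c in checks]
    return results, all(r["passed"] for r in results)

rows = [
    {"impression_id": "a", "ts": 1},
    {"impression_id": "b", "ts": 2},
    {"impression_id": "a", "ts": None},  # duplicate id, null timestamp
]
results, ok = run_checks(rows, [
    lambda r: check_not_null(r, "ts"),
    lambda r: check_unique(r, "impression_id"),
])
```

In practice these assertions would live in a framework (e.g., Deequ or Great Expectations) and gate promotion of data rather than run ad hoc.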
What we're looking for:
- Production data engineering experience
- Proficiency in Spark and Scala, with proven experience building data infrastructure in Spark using Scala
- Familiarity with data lakes, cloud warehouses, and storage formats
- Strong proficiency in AWS services
- Expertise in SQL for data manipulation and extraction
- Excellent written and verbal communication skills
- Bachelor's degree in Computer Science or a related field
Nice-to-Haves:
- Experience in adtech
- Experience implementing data governance practices, including data quality, metadata management, and access controls
- Strong understanding of privacy-by-design principles and handling of sensitive or regulated data
- Familiarity with open table formats such as Apache Iceberg or Delta Lake
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US-based applicants only: $123,696–$254,667 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
Job Title: Data Engineer
Location: 100% Remote
Employment Type: W2 Contract, 6 Month Contract with possibility of extension
Pay Rate: $50.00 – $55.00/hour
Role Overview:
BEPC is seeking a Data Engineer to support our client by designing, building, and optimizing scalable data pipelines and architectures. This role is ideal for a technically strong professional who thrives in a collaborative environment and enjoys working with large datasets, cloud platforms, and modern data technologies to drive business insights.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines for large-scale structured and unstructured data.
- Build and optimize data architectures, models, and database systems for performance and scalability.
- Develop data solutions using cloud platforms (AWS, Azure, or GCP).
- Collaborate with cross-functional teams to translate business needs into technical solutions.
- Ensure data quality, integrity, and security, especially with sensitive datasets.
- Integrate data from multiple sources including databases, APIs, and flat files.
- Support analytics and machine learning initiatives with clean, reliable datasets.
- Troubleshoot and resolve data pipeline and performance issues.
- Document systems, workflows, and processes for maintainability and knowledge sharing.
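The responsibilities above include integrating data from databases, APIs, and flat files into a single pipeline. As a minimal sketch under stated assumptions (the sources, field names, and in-memory "warehouse" are all invented stand-ins for real systems):

```python
# Toy end-to-end ETL: extract from a flat file and an API-style payload,
# transform (join + type normalization), and load into a target store.
import csv
import io
import json

FLAT_FILE = "id,name\n1,Alice\n2,Bob\n"               # stand-in for a CSV on disk
API_PAYLOAD = json.dumps([{"id": 1, "spend": 120.0},  # stand-in for an API response
                          {"id": 2, "spend": 75.5}])

def extract():
    """Pull rows from both sources."""
    users = list(csv.DictReader(io.StringIO(FLAT_FILE)))
    spend = json.loads(API_PAYLOAD)
    return users, spend

def transform(users, spend):
    """Join the two sources on id and normalize types."""
    by_id = {s["id"]: s["spend"] for s in spend}
    return [{"id": int(u["id"]), "name": u["name"],
             "spend": by_id.get(int(u["id"]), 0.0)}
            for u in users]

def load(rows, target):
    """'Load' into an in-memory target keyed by id (a real job writes to a DB)."""
    for r in rows:
        target[r["id"]] = r
    return target

warehouse = load(transform(*extract()), {})
```

A production version of each stage would be a separate, retryable task in an orchestrator such as Airflow, with the join done in the database or in Spark rather than in application memory.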
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field.
- 3+ years of experience in data engineering or similar roles.
- Strong experience with ETL processes and data pipeline development.
- Proficiency in SQL and Python.
- Experience with Databricks, Apache Spark, or similar big data tools.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Strong understanding of database design and optimization.
- Experience working with large-scale and distributed data systems.
- Advanced English communication skills.
Preferred Qualifications:
- Experience with real-time data processing or streaming technologies.
- Familiarity with industrial data systems (e.g., PLCs, LabVIEW).
- Exposure to machine learning workflows or data science collaboration.
- Knowledge of data governance and compliance standards.
Job Description Summary
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will be instrumental in designing, building, and maintaining robust and scalable data pipelines and solutions within the Microsoft Azure ecosystem. You will be responsible for developing and optimizing ETL/ELT processes, ensuring data quality, and enabling efficient data access for analytics and business intelligence. We are looking for a hands-on engineer who thrives in a fast-paced environment and is passionate about leveraging cutting-edge technologies.
Key Responsibilities:
Design, develop, and maintain cloud-based data pipelines and ETL/ELT workflows.
Build and optimize data architectures to support structured and unstructured data processing.
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.
Implement data quality, security, and governance best practices.
Monitor and troubleshoot data workflows to ensure high availability and performance.
Optimize database and data storage solutions for performance and cost efficiency.
Contribute to cloud adoption, migration, and modernization initiatives.
Mandatory Skills:
Strong expertise with Azure cloud platform.
Strong experience in Databricks
Azure Data Factory proficiency required; building datasets, data flows, and pipelines in ADF (not just maintaining something already built)
Hands-on experience with ETL/ELT tools and frameworks.
Proficiency in SQL, Python, and data modeling.
Knowledge of CI/CD pipelines and infrastructure-as-code tools.
Understanding of data governance, security, and compliance.
Preferred Skills:
Exposure to API integration and microservices architecture.
Strong analytical and problem-solving skills.
Azure cloud certifications and/or prior hands-on Azure experience
AKS (Azure Kubernetes Service) experience, and ETL related to applications containerized & deployed on AKS (or EKS)
Job Summary for Azure Data Engineer:
We are seeking a Senior Data Engineer to join a dynamic data team focused on building and modernizing enterprise data platforms. This role combines hands-on engineering, platform support, and forward-looking architecture design, with an emphasis on mentoring junior team members and driving best practices.
Job Qualifications and Responsibilities for Azure Data Engineer:
Key Responsibilities
- Design, develop, and maintain scalable data warehouse and lakehouse architectures
- Implement and optimize Medallion Architecture (Bronze, Silver, Gold layers)
- Build robust data pipelines using Python as a primary language
- Ensure data observability, quality checks, and governance using modern tools
- Support existing data platform (currently on Microsoft Fabric) – ~50% of role
- Contribute to platform modernization strategy, evaluating and potentially implementing solutions like Databricks or Snowflake – ~50% of role
- Develop and maintain data catalogs and metadata management frameworks
- Collaborate with cross-functional teams to understand and deliver on data requirements
- Mentor junior engineers and promote engineering best practices
Required Qualifications
- Strong experience in:
- Data Warehousing concepts
- Lakehouse architecture
- Medallion Architecture
- Data Observability & Data Quality frameworks
- Data Cataloging tools and practices
- Proficiency in Python (primary development language)
- Hands-on experience with cloud platforms (Azure preferred; AWS acceptable with willingness to quickly learn Azure)
- Strong problem-solving and analytical skills
- Excellent communication and interpersonal skills
**Candidate must be willing to go into office 3 days a week**
Senior Full-Stack AI & Data Engineer – Contract
RBA is an established leader and trusted partner for enterprise and mid-size organizations seeking to transform their business through technology solutions. As a Digital and Technology consultancy, we combine strategic insight with technical expertise to deliver impactful, scalable solutions that align with business goals. We take pride in working with some of the most recognized companies in our market—while fostering a culture that blends challenging career opportunities with a collaborative, fun work environment.
We are seeking a Senior Full-Stack AI & Data Engineer to join our growing Data & AI practice, supporting a high-impact client. In this role, you will lead the design and development of end-to-end AI-powered applications that drive personalization, predictive analytics, and next-generation digital experiences.
You’ll partner with business stakeholders, product teams, and engineers to build production-grade AI solutions—from data pipelines and model development to APIs and user-facing applications. The ideal candidate brings deep expertise across the full stack, modern data platforms, and generative AI technologies, with a passion for solving complex business challenges through innovative solutions.
Responsibilities
- Design and develop end-to-end AI-powered applications, including backend APIs and user-facing interfaces, to enable scalable and intuitive AI solutions.
- Build and maintain robust APIs using technologies such as Node.js, NestJS, or FastAPI, and develop modern web applications using React or similar frameworks.
- Develop, fine-tune, and deploy machine learning models using frameworks such as PyTorch and Scikit-learn.
- Implement advanced generative AI solutions, including Retrieval-Augmented Generation (RAG) pipelines and multi-modal AI applications.
- Design and build agentic AI systems using frameworks such as LangChain, enabling multi-step reasoning, tool use, and automation.
- Architect and optimize end-to-end data pipelines (ETL/ELT) using Python, SQL, and orchestration tools such as Airflow.
- Manage and integrate data workflows within Snowflake, leveraging technologies such as Snowpark or Cortex.
- Implement monitoring and observability for AI systems, including tracking model performance, drift, latency, and reliability.
- Design and deploy cloud-native solutions using Docker, Kubernetes, and CI/CD pipelines across AWS, Azure, or GCP.
- Collaborate with business stakeholders to translate data into actionable insights and intelligent applications.
- Contribute to DevOps best practices, including infrastructure-as-code (Terraform) and automated testing.
- Mentor junior engineers and promote best practices in AI ethics, data governance, and code quality.
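The responsibilities above mention Retrieval-Augmented Generation (RAG) pipelines. As a rough sketch of just the retrieval-and-prompt-assembly step, here is a toy plain-Python version; a production system would use embeddings and a vector database rather than keyword overlap, and all document text and function names here are illustrative:

```python
# Toy RAG retrieval step: score documents against a query, take the top-k,
# and assemble a prompt context for the generator model.
import string

def tokens(text):
    """Lowercase, split on whitespace, and strip surrounding punctuation."""
    return {w.strip(string.punctuation) for w in text.lower().split()}

def score(query, doc):
    """Crude relevance: fraction of query words appearing in the document."""
    q, d = tokens(query), tokens(doc)
    return len(q & d) / len(q) if q else 0.0

def retrieve(query, docs, k=2):
    """Return the top-k documents by relevance score."""
    return sorted(docs, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query, docs):
    """Concatenate retrieved context with the question for the generator."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Snowflake stores data in micro-partitions.",
    "Airflow schedules DAGs of tasks.",
    "Snowpark lets you run Python code inside Snowflake.",
]
prompt = build_prompt("How does Snowflake store data?", docs)
```

Frameworks such as LangChain package exactly this shape (retriever plus prompt template plus LLM call) as composable components.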
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
- 5+ years of experience across full-stack development, including backend (Node.js/Python) and frontend frameworks (React or similar).
- Strong experience designing and building data pipelines and modern data platforms, including expertise in SQL and data modeling.
- Proven experience deploying AI/ML solutions in production environments, including MLOps and model lifecycle management.
- Hands-on experience with generative AI technologies, including LLMs, prompt engineering, and RAG architectures.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Strong understanding of DevOps practices, including CI/CD, containerization, and infrastructure-as-code (Terraform).
- Excellent communication skills and ability to work effectively in client-facing environments.
Preferred Qualifications
- Experience with Snowflake, including Snowpark, Cortex, or similar data platform capabilities.
- Experience building agent-based AI systems or working with frameworks such as LangChain.
- Familiarity with vector databases and semantic search architectures.
- Experience developing mobile applications using React Native or Flutter.
- Knowledge of mobile architecture, UI/UX principles, and API integration patterns.
- Experience deploying applications to Apple App Store or Google Play Store.
- Familiarity with security and authentication protocols, including OAuth2, biometric authentication, and secure data handling.
- Cloud or data platform certifications (AWS, Azure, GCP, Snowflake, or similar).
Leadership & Culture
- Demonstrate leadership through mentorship, technical guidance, and promoting engineering best practices.
- Balance innovation with pragmatism—able to work across cutting-edge AI solutions and foundational data engineering tasks.
- Thrive in a collaborative, fast-paced consulting environment with a strong focus on client impact and delivery excellence.
Job Description
At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.
The Boeing Company is looking for a Senior Digital Engineer – Full Stack & Systems Architecture to join our team in Charleston, SC; El Segundo, CA; Huntsville, AL; Mesa, AZ; Oklahoma City, OK; Philadelphia, PA; or Berkeley, MO.
Boeing Test & Evaluation (BT&E) generates enormous volumes of data, but data alone does not create insight. We are building a Digital Engineering capability focused on transforming test intent into reusable knowledge through intuitive applications, scalable systems, and thoughtful architecture.
As a Digital Engineer – Full Stack & Systems Architecture, you will sit at the intersection of engineering workflows, software systems, and cloud platforms. Your mission is to empower BT&E engineers to work differently by designing and delivering digital products that shorten feedback loops, reduce friction, and accelerate learning from every test event.
This is not a traditional backend or data-engineering role. You will design end-to-end solutions, working from initial understanding of engineer needs, to shaping application architecture, to implementing full-stack solutions that integrate data, automation, and cloud services. You will partner closely with BT&E, BCA, BDS, and Wisk/Autonomy engineers to modernize how test data is accessed, explored, and operationalized across Boeing.
If you enjoy systems thinking, building products engineers actually want to use, and architecting platforms that scale across programs and clouds, this role is for you.
Position Responsibilities:
Digital Product & Application Development
- Design and develop full stack applications that improve test and evaluation workflows, decision making, and engineering productivity
- Translate ambiguous engineering problems into clear digital solutions, balancing usability, performance, and scalability
- Develop frontend and backend services using modern frameworks (e.g., React, Node.js, Python, .NET)
- Design and implement APIs and service interfaces that enable integration across test systems, analytics platforms, and enterprise tools
Systems Architecture & Cloud Engineering
- Architect end-to-end systems spanning applications, data services, and cloud infrastructure
- Evaluate and select cloud services across AWS and Azure based on cost, usability, scalability, and long-term maintainability
- Implement infrastructure as code using Terraform, CloudFormation, ARM, or Bicep to support repeatable, secure deployments
- Design solutions that support multi-cloud and hybrid environments as required by program needs
Data-Enabled Engineering (as a Platform Capability)
- Design data models and storage solutions that support both transactional systems and analytical workloads
- Build and integrate data services that allow engineers to discover, explore, and reuse test data efficiently
- Collaborate with data scientists and analysts to enable analytics, visualization, and ML workflows without burdening users with infrastructure complexity
DevOps, Reliability & Security
- Build CI/CD pipelines to support rapid iteration, testing, and safe deployment of applications
- Apply SRE principles to ensure reliability, observability, and operational excellence
- Build and maintain observability capabilities (logs, metrics, and traces) to enable rapid diagnosis, performance optimization, and reliable operation of digital engineering systems
- Partner with security and compliance teams to ensure solutions meet Boeing security, data governance, and regulatory requirements
- Contribute to operational documentation, runbooks, and continuous improvement efforts
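The observability bullet above (structured logs, metrics, traces) can be sketched in miniature. This is an illustrative, dependency-free Python example, not Boeing's actual stack; the logger name and function are hypothetical:

```python
import json
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO, format="%(message)s")
logger = logging.getLogger("digital-engineering")  # hypothetical logger name

def observed(fn):
    """Emit a structured (JSON) log record with timing and status for each call."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        status = "error"  # assume failure until the call completes
        try:
            result = fn(*args, **kwargs)
            status = "ok"
            return result
        finally:
            record = {
                "event": fn.__name__,
                "status": status,
                "duration_ms": round((time.perf_counter() - start) * 1000, 2),
            }
            logger.info(json.dumps(record))
    return wrapper

@observed
def process_test_event(event_id: str) -> str:
    # Placeholder for real test-event processing logic
    return f"processed {event_id}"
```

Emitting machine-parseable JSON rather than free-form text is what lets log aggregation and metrics tooling slice by event name, status, and latency.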
Collaboration & Technical Leadership
- Work closely with engineers, product owners, and stakeholders to shape digital roadmaps and technical direction
- Influence architecture and design decisions across programs through systems thinking and engineering judgment
- Collaborate with peers and contribute to a growing Digital Engineering community within BT&E
Basic Qualifications (Required Skills/Experience):
- Bachelor of Science degree in Engineering, Engineering Technology (including Manufacturing Technology), Computer Science, Data Science, Mathematics, Physics, Chemistry or non-US equivalent qualifications directly related to the work statement
- 5+ years of experience developing full stack applications with modern frameworks
- Strong systems thinking skills with experience designing end-to-end software solutions
- Proficiency in one or more programming languages (JavaScript/TypeScript, Python, C#, or Go)
- Experience deploying and operating applications in cloud environments (Azure and/or AWS)
- Hands-on experience with infrastructure as code (Terraform, CloudFormation, ARM/Bicep)
- Working knowledge of CI/CD pipelines, Git, Docker, and Linux
- Experience designing and working with relational databases (e.g., PostgreSQL), including schema design and performance considerations
- Familiarity with security best practices (IAM, secrets management, network controls)
Preferred Qualifications (Desired Skills/Experience):
- Experience designing developer platforms or internal engineering tools
- Background in Digital Engineering, Model Based Systems Engineering (MBSE), or engineering workflow automation
- Cloud certifications (AWS and/or Azure)
- Experience with Kubernetes, serverless architectures, or event-driven systems
- Exposure to data pipelines, analytics platforms, or data-enabled applications
- Experience working in regulated or safety-critical environments
- Understanding of aerospace, test & evaluation, or large-scale engineering programs
- Familiarity with ITAR, EAR, DFARS, or similar compliance frameworks
Drug Free Workplace:
Boeing is a Drug Free Workplace where post offer applicants and employees are subject to testing for marijuana, cocaine, opioids, amphetamines, PCP, and alcohol when criteria are met as outlined in our policies.
Conflict of Interest:
Successful candidates for this job must satisfy the Company's Conflict of Interest (COI) assessment process.
Pay & Benefits:
At Boeing, we strive to deliver a Total Rewards package that will attract, engage and retain the top talent. Elements of the Total Rewards package include competitive base pay and variable compensation opportunities.
The Boeing Company also provides eligible employees with an opportunity to enroll in a variety of benefit programs, generally including health insurance, flexible spending accounts, health savings accounts, retirement savings plans, life and disability insurance programs, and a number of programs that provide for both paid and unpaid time away from work.
The specific programs and options available to any given employee may vary depending on eligibility factors such as geographic location, date of hire, and the applicability of collective bargaining agreements.
Pay is based upon candidate experience and qualifications, as well as market and business considerations.
Summary Pay Range: $127,500 – $197,800
Applications for this position will be accepted until Mar. 21, 2026
Export Control Requirements:
This position must meet U.S. export control compliance requirements. To meet U.S. export control compliance requirements, a "U.S. Person" as defined by 22 C.F.R. §120.62 is required. "U.S. Person" includes U.S. Citizen, U.S. National, lawful permanent resident, refugee, or asylee.
Export Control Details:
US based job, US Person required
Education
Bachelor's Degree or Equivalent Required
Relocation
Relocation assistance is not a negotiable benefit for this position.
Visa Sponsorship
Employer will not sponsor applicants for employment visa status.
Shift
This position is for 1st shift
Equal Opportunity Employer:
Boeing is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military/veteran status or other characteristics protected by law.
Senior Analytics Engineer
Overview
A rapidly growing consumer products company is seeking a Senior Analytics Engineer to help build and scale a modern data platform. This role sits at the intersection of analytics engineering, data infrastructure, and business intelligence, enabling teams across the organization to make data-driven decisions.
The company operates a U.S.-based manufacturing environment and a strong direct-to-consumer ecommerce platform. As the organization continues to scale, the data function is being built from the ground up, creating an opportunity for a hands-on engineer to shape the architecture, pipelines, and analytics capabilities of the business.
Responsibilities
Data Platform Development
- Build, maintain, and optimize data models using SQL and dbt
- Support migration and development of a centralized data warehouse environment
- Design scalable data architecture and transformation layers
- Improve reliability, performance, and maintainability of analytics infrastructure
Data Pipeline Engineering
- Develop and maintain ETL/ELT pipelines using modern data tools
- Expand and optimize ingestion pipelines from operational systems
- Write custom workflows and integrations using Python
- Ensure data quality, monitoring, and pipeline stability
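The pipeline bullets above (custom Python workflows with built-in data quality) can be illustrated with a minimal transform-and-validate step. The field names and sample rows below are invented for illustration; a real pipeline would read from the operational systems mentioned above:

```python
from datetime import date

# Raw rows as they might arrive from an operational source (illustrative data)
RAW_ORDERS = [
    {"order_id": "1001", "amount": "49.99", "placed_at": "2024-06-01"},
    {"order_id": "1002", "amount": "19.50", "placed_at": "2024-06-02"},
    {"order_id": "1003", "amount": "",      "placed_at": "2024-06-02"},  # fails quality check
]

def transform(rows):
    """Cast string fields to proper types; quarantine rows that fail casting."""
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "placed_at": date.fromisoformat(row["placed_at"]),
            })
        except (ValueError, KeyError):
            rejected.append(row)
    return clean, rejected

def quality_report(clean, rejected):
    """Summarize pipeline health so monitoring can alert on reject spikes."""
    total = len(clean) + len(rejected)
    return {"rows_in": total, "rows_out": len(clean), "reject_rate": len(rejected) / total}

clean, rejected = transform(RAW_ORDERS)
report = quality_report(clean, rejected)
```

Routing bad rows to a quarantine list rather than failing the whole load keeps the pipeline stable while still surfacing quality issues through the report.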
Business Intelligence & Analytics
- Develop and maintain dashboards and reporting solutions
- Enable self-service analytics for business teams
- Work directly with stakeholders to translate business needs into data solutions
- Support analytics across key functions, including:
  - Supply chain
  - Ecommerce performance
  - Marketing analytics
  - Sales performance
  - Forecasting and operations
Data Governance & Reliability
- Establish trusted datasets and consistent data definitions
- Improve data documentation and discoverability
- Troubleshoot data issues and analytics requests across teams
- Ensure long-term scalability of the analytics ecosystem
Required Qualifications
- 4+ years of experience working with SQL
- 4+ years of experience using dbt
- 4+ years of experience building dashboards and BI solutions
- Experience building and managing data pipelines and ETL workflows
- Strong understanding of data warehousing concepts
- Ability to work independently in a fast-paced, evolving environment
- Strong communication skills and experience collaborating with non-technical stakeholders
Preferred Qualifications
- Experience working with BigQuery
- Experience building dashboards in Looker
- Python for data workflows or ingestion pipelines
- Experience with ecommerce analytics
- Experience analyzing Shopify or similar commerce platforms
- Experience working with manufacturing or supply chain data
Ideal Candidate Background
Strong candidates often come from:
- Ecommerce organizations
- Manufacturing companies
- Businesses operating direct-to-consumer sales models
- Mid-sized companies where individuals have broad ownership of the data stack
Experience analyzing:
- Ecommerce sales performance
- Supply chain operations
- Marketing attribution
- Product and operational data
Work Environment
- Hybrid work model with 2–3 days per week in office
- Collaboration with a small technical team including IT and data science
- Fast-paced environment with significant opportunity to influence the company’s data strategy
- High level of autonomy and ownership over technical solutions
What We're Looking For
- Curious and evidence-driven
- Comfortable working with ambiguity
- Self-directed and proactive
- Passionate about learning new technologies
- A strong problem solver who enjoys building scalable systems
Job Summary:
Our client is seeking a Senior Data Analytics Engineer (Customer Data) to join their team! This position is located in Irving, Texas.
Duties:
- Support cross-functional teams including Marketing, Data Science, Product, and Digital
- Build datasets that power customer segmentation, personalization workflows, campaign and lifecycle analytics, BI dashboards and KPIs, and real-time, ML-driven customer experiences
- Build, optimize, and maintain customer data pipelines using PySpark/Databricks
- Transform raw customer data into analytics-ready datasets for reporting, segmentation, personalization, and AI/ML applications
- Develop customer behavior metrics, campaign insights, and lifecycle reporting layers
- Design datasets used by Power BI/Tableau; dashboard creation is a plus, not required
- Optimize Databricks performance, addressing issues such as skewed joins, partitioning, sorting, and caching/persist strategy
- Work across AWS/Azure/GCP and integrate pipelines with CDPs
- Participate in ingestion and digestion phases to shape MarTech and BI analytical layers
- Document and uphold data engineering standards, governance, and best practices across teams
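The skewed-join optimization mentioned in the duties above is commonly handled with key salting. Below is a dependency-free Python sketch of the idea; in PySpark the same pattern would use a salt column and a composite join key. The customer keys and salt count are illustrative assumptions:

```python
import random
from collections import Counter

def salt_key(key: str, num_salts: int, rng: random.Random) -> str:
    """Large (skewed) side: append a random salt so one hot key is
    spread across num_salts buckets instead of landing in a single one."""
    return f"{key}#{rng.randrange(num_salts)}"

def explode_key(key: str, num_salts: int) -> list[str]:
    """Small side: replicate each key once per salt value so every
    salted variant on the large side still finds its join partner."""
    return [f"{key}#{i}" for i in range(num_salts)]

rng = random.Random(42)  # seeded for reproducibility
NUM_SALTS = 4            # illustrative; tune to the observed skew

# A heavily skewed event stream: one customer dominates the key distribution
events = ["cust_1"] * 1000 + ["cust_2"] * 10
salted = [salt_key(k, NUM_SALTS, rng) for k in events]

# The hot key's rows are now spread across NUM_SALTS buckets
spread = Counter(k for k in salted if k.startswith("cust_1#"))
```

In Databricks/PySpark this corresponds to adding a salt column (e.g., via `F.rand()`) to the skewed side, cross-joining the small side with the salt range, and joining on the composite key; adaptive query execution can also mitigate skew automatically in recent Spark versions.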
Desired Skills/Experience:
- 6+ years in Data Engineering or Analytics Engineering
- Strong hands-on experience with: Databricks, PySpark, Python and SQL
- Proven experience with customer/marketing data: segmentation, personalization, campaign analytics, retention, behavioral metrics
- Ability to design performance-optimized pipelines, batch or near real-time
- Experience building datasets consumed by Power BI/Tableau
- Understanding of CDP workflows, customer identity data, traits/feature modeling, and activation
- Strong communication skills, translating marketing needs into technical data solutions
- Power BI expertise is a major plus
- Experience with Delta Lake, orchestration, or feature engineering for ML
- Background as an Analytics Engineer, BI/Data Modeling Engineer, or Data Engineer with strong analytics orientation
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $140,000. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at