Exl Senior Data Engineer Salary Jobs in USA
11,177 positions found
Sr. Data Engineer (Hybrid)
Chicago, IL
The American Medical Association (AMA) is the nation's largest professional association of physicians and a non-profit organization. We are a unifying voice and powerful ally for America's physicians, the patients they care for, and the promise of a healthier nation. To be part of the AMA is to be part of our mission to promote the art and science of medicine and the betterment of public health.
At AMA, our mission to improve the health of the nation starts with our people. We foster an inclusive, people-first culture where every employee is empowered to perform at their best. Together, we advance meaningful change in health care and the communities we serve.
We encourage and support professional development for our employees, and we are dedicated to social responsibility. We invite you to learn more about us and we look forward to getting to know you.
We have an opportunity at our corporate offices in Chicago for a Sr. Data Engineer (Hybrid) on our Information Technology team. This is a hybrid position reporting into our Chicago, IL office, requiring 3 days a week in the office.
As a Sr. Data Engineer, you will play a key role in implementing and maintaining AMA's enterprise data platform to support analytics, interoperability, and responsible AI adoption. This role partners closely with platform engineering, data governance, data science, IT security, and business stakeholders to deliver high-quality, reliable, and secure data products. It contributes to AMA's modern lakehouse architecture, optimizes data operations, and embeds governance and quality standards into engineering workflows. The role also serves as a senior technical contributor within the team, providing mentorship to junior engineers and implementing engineering best practices within the data platform function, in alignment with architectural direction set by leadership.
RESPONSIBILITIES:
Data Engineering & AI Enablement
- Build and maintain scalable data pipelines and ETL/ELT workflows supporting analytics, operational reporting, and AI/ML use cases.
- Implement best practice patterns for ingestion, transformation, modeling, and orchestration within a modern lakehouse environment (e.g., Databricks, Delta Lake, Azure Data Lake).
- Develop high-performance data models and curated datasets with strong attention to quality, usability, and interoperability; create reusable engineering components and automation.
- Collaborate with the Architecture Team, the Data Platform Lead, and federated IT teams to optimize storage, compute, and architectural patterns for performance and cost efficiency.
- Build model-ready datasets and feature pipelines to support AI/ML use cases; serve as a technical coordination point supporting business units' AI-related infrastructure needs.
- Collaborate with data scientists and the AI Working Group to operationalize models responsibly and maintain ongoing monitoring signals.
Governance, Quality & Compliance
- Embed data governance, metadata standards, lineage tracking, and quality controls directly into engineering workflows, and ensure their technical implementation and alignment.
- Work with the Data Governance Lead and business stakeholders to operationalize stewardship, classification, validation, retention, and access standards.
- Implement privacy-by-design and security-by-design principles, ensuring compliance with internal policies and regulatory obligations.
- Maintain documentation for pipelines, datasets, and transformations to support transparency and audit requirements.
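Embedding quality controls directly into a pipeline, as the bullets above describe, can be sketched as a validation step that partitions records and emits a quality summary. This is a minimal illustration in plain Python; the dataset, field names, and rules are hypothetical, and a production Databricks pipeline would typically use a framework such as Delta Live Tables expectations or Great Expectations instead.

```python
# Illustrative sketch: a quality-control gate inside a pipeline step.
# Records, required fields, and status rules are hypothetical examples.

def validate_records(records, required_fields, allowed_statuses):
    """Partition records into (valid, rejected) and report a quality summary."""
    valid, rejected = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        bad_status = rec.get("status") not in allowed_statuses
        if missing or bad_status:
            rejected.append({"record": rec, "missing": missing})
        else:
            valid.append(rec)
    summary = {
        "total": len(records),
        "valid": len(valid),
        "rejected": len(rejected),
    }
    return valid, rejected, summary

rows = [
    {"id": 1, "status": "active", "email": "a@example.org"},
    {"id": 2, "status": "unknown", "email": "b@example.org"},  # bad status
    {"id": 3, "status": "active", "email": ""},                # missing field
]
valid, rejected, summary = validate_records(
    rows, required_fields=["id", "email"], allowed_statuses={"active", "inactive"}
)
print(summary)  # {'total': 3, 'valid': 1, 'rejected': 2}
```

Routing rejects to a quarantine table rather than failing the whole run is a common way to keep pipelines reliable while still surfacing quality signals for audit.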
Platform Reliability, Observability & Optimization
- Monitor and troubleshoot pipeline failures, performance bottlenecks, data anomalies, and platform-level issues.
- Implement observability tooling, alerts, logging, and dashboards to ensure end-to-end reliability.
- Support cost governance by optimizing compute resources, refining job schedules, and advising on efficient architecture.
- Collaborate with the Data Platform Lead on scaling, configuration management, CI/CD pipelines, and environment management.
- Collaborate with business units to understand data needs, translate them into engineering requirements, and deliver fit-for-purpose data solutions; share and apply best practices and emerging technologies within assigned initiatives.
- Work with IT Security and Legal/Compliance to ensure the platform and datasets meet risk and regulatory standards.
Staff Management
- Lead, mentor, and provide management oversight for staff.
- Set objectives, evaluate employee performance, and foster a collaborative team environment.
- Develop staff knowledge and skills to support career development.
May include other responsibilities as assigned.
REQUIREMENTS:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field preferred; equivalent work experience with a high school diploma (or equivalent) required.
- 5+ years of experience in data engineering within cloud environments
- Experience in people management preferred.
- Demonstrated hands-on experience with modern data platforms (Databricks preferred).
- Proficiency in Python, SQL, and data transformation frameworks.
- Experience designing and operationalizing ETL/ELT pipelines, orchestration workflows (Airflow, Databricks Workflows), and CI/CD processes.
- Solid understanding of data modeling, structured/unstructured data patterns, and schema design.
- Experience implementing governance and quality controls: metadata, lineage, validation, stewardship workflows.
- Working knowledge of cloud architecture, IAM, networking, and security best practices.
- Demonstrated ability to collaborate across technical and business teams.
- Exposure to AI/ML engineering concepts, feature stores, model monitoring, or MLOps patterns.
- Experience with infrastructure-as-code (Terraform, CloudFormation) or DevOps tooling.
The American Medical Association is located at 330 N. Wabash Avenue, Chicago, IL 60611 and is convenient to all public transportation in Chicago.
This role is an exempt position, and the salary range for this position is $115,523.42-$150,972.44. This is the lowest to highest salary we believe we would pay for this role at the time of this posting. An employee's pay within the salary range will be determined by a variety of factors including but not limited to business consideration and geographical location, as well as candidate qualifications, such as skills, education, and experience. Employees are also eligible to participate in an incentive plan. To learn more about the American Medical Association's benefits offerings, please click here.
We are an equal opportunity employer, committed to diversity in our workforce. All qualified applicants will receive consideration for employment. As an EOE/AA employer, the American Medical Association will not discriminate in its employment practices due to an applicant's race, color, religion, sex, age, national origin, sexual orientation, gender identity and veteran or disability status.
THE AMA IS COMMITTED TO IMPROVING THE HEALTH OF THE NATION
Remote working/work at home options are available for this role.
OZ – Databricks Architect / Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Application Development: Experience building GenAI applications is a strong plus.
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Tackles complex data challenges, ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Microsoft Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature, it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
About tvScientific
tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one, efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.
As a Senior Data Engineer at tvScientific, you will be a key player in implementing the robust data infrastructure to power our data-heavy company. You will collaborate with our cross-functional teams to evolve our core data pipelines, design for efficiency as we scale, and store data in optimal engines and formats. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.
What you'll do:
- Implement robust data infrastructure in AWS, using Spark with Scala
- Evolve our core data pipelines to efficiently scale for our massive growth
- Store data in optimal engines and formats
- Collaborate with our cross-functional teams to design data solutions that meet business needs
- Build out fault-tolerant batch and streaming pipelines
- Leverage and optimize AWS resources while designing for scale
- Collaborate closely with our Data Science and Product teams
How we'll define success:
- Successful implementation of scalable and efficient data infrastructure
- Timely delivery and optimization of data assets and APIs
- High attention to detail in implementation of automated data quality checks
- Effective collaboration with cross-functional teams
What we're looking for:
- Production data engineering experience
- Proficiency in Spark and Scala, with proven experience building data infrastructure in Spark using Scala
- Familiarity with data lakes, cloud warehouses, and storage formats
- Strong proficiency in AWS services
- Expertise in SQL for data manipulation and extraction
- Excellent written and verbal communication skills
- Bachelor's degree in Computer Science or a related field
Nice-to-Haves
- Experience in adtech
- Experience implementing data governance practices, including data quality, metadata management, and access controls
- Strong understanding of privacy-by-design principles and handling of sensitive or regulated data
- Familiarity with open table formats such as Apache Iceberg and Delta Lake
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US based applicants only: $123,696–$254,667 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
Job Title: Data Engineer
Location: 100% Remote
Employment Type: W2 Contract, 6 Month Contract with possibility of extension
Pay Rate: $50.00 – $55.00/hour
Role Overview:
BEPC is seeking a Data Engineer to support our client by designing, building, and optimizing scalable data pipelines and architectures. This role is ideal for a technically strong professional who thrives in a collaborative environment and enjoys working with large datasets, cloud platforms, and modern data technologies to drive business insights.
Key Responsibilities:
- Design, develop, and maintain ETL pipelines for large-scale structured and unstructured data.
- Build and optimize data architectures, models, and database systems for performance and scalability.
- Develop data solutions using cloud platforms (AWS, Azure, or GCP).
- Collaborate with cross-functional teams to translate business needs into technical solutions.
- Ensure data quality, integrity, and security, especially with sensitive datasets.
- Integrate data from multiple sources including databases, APIs, and flat files.
- Support analytics and machine learning initiatives with clean, reliable datasets.
- Troubleshoot and resolve data pipeline and performance issues.
- Document systems, workflows, and processes for maintainability and knowledge sharing.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or related field.
- 3+ years of experience in data engineering or similar roles.
- Strong experience with ETL processes and data pipeline development.
- Proficiency in SQL and Python.
- Experience with Databricks, Apache Spark, or similar big data tools.
- Hands-on experience with cloud platforms (AWS, Azure, or GCP).
- Strong understanding of database design and optimization.
- Experience working with large-scale and distributed data systems.
- Advanced English communication skills.
Preferred Qualifications:
- Experience with real-time data processing or streaming technologies.
- Familiarity with industrial data systems (e.g., PLCs, LabVIEW).
- Exposure to machine learning workflows or data science collaboration.
- Knowledge of data governance and compliance standards.
Job Description Summary
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will be instrumental in designing, building, and maintaining robust and scalable data pipelines and solutions within the Microsoft Azure ecosystem. You will be responsible for developing and optimizing ETL/ELT processes, ensuring data quality, and enabling efficient data access for analytics and business intelligence. We are looking for a hands-on engineer who thrives in a fast-paced environment and is passionate about leveraging cutting-edge technologies.
Key Responsibilities:
Design, develop, and maintain cloud-based data pipelines and ETL/ELT workflows.
Build and optimize data architectures to support structured and unstructured data processing.
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.
Implement data quality, security, and governance best practices.
Monitor and troubleshoot data workflows to ensure high availability and performance.
Optimize database and data storage solutions for performance and cost efficiency.
Contribute to cloud adoption, migration, and modernization initiatives.
Mandatory Skills:
Strong expertise with Azure cloud platform.
Strong experience in Databricks
Azure Data Factory proficiency required: building datasets, data flows, and pipelines in ADF (not just maintaining something already built)
Hands-on experience with ETL/ELT tools and frameworks.
Proficiency in SQL, Python, and data modeling.
Knowledge of CI/CD pipelines and infrastructure-as-code tools.
Understanding of data governance, security, and compliance.
Preferred Skills:
Exposure to API integration and microservices architecture.
Strong analytical and problem-solving skills.
Azure cloud certifications and/or relevant past Azure experience
AKS (Azure Kubernetes Service) experience, and ETL related to applications containerized & deployed on AKS (or EKS)
Job Summary for Azure Data Engineer:
We are seeking a Senior Data Engineer to join a dynamic data team focused on building and modernizing enterprise data platforms. This role combines hands-on engineering, platform support, and forward-looking architecture design, with an emphasis on mentoring junior team members and driving best practices.
Job Qualifications and Responsibilities for Azure Data Engineer:
Key Responsibilities
- Design, develop, and maintain scalable data warehouse and lakehouse architectures
- Implement and optimize Medallion Architecture (Bronze, Silver, Gold layers)
- Build robust data pipelines using Python as a primary language
- Ensure data observability, quality checks, and governance using modern tools
- Support existing data platform (currently on Microsoft Fabric) – ~50% of role
- Contribute to platform modernization strategy, evaluating and potentially implementing solutions like Databricks or Snowflake – ~50% of role
- Develop and maintain data catalogs and metadata management frameworks
- Collaborate with cross-functional teams to understand and deliver on data requirements
- Mentor junior engineers and promote engineering best practices
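The Medallion Architecture named above organizes data into Bronze (raw), Silver (cleansed), and Gold (business-level) layers. As a rough sketch under hypothetical data, the layering can be shown in plain Python; in a real lakehouse each layer would be a Delta table written by Spark or Fabric pipelines, not in-memory lists.

```python
# Minimal sketch of the Medallion pattern (Bronze/Silver/Gold).
# The rows, source name, and cleansing rules are hypothetical examples.

def bronze(raw_rows):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [{**r, "_source": "orders_api"} for r in raw_rows]

def silver(bronze_rows):
    """Silver: cleanse and conform (drop malformed rows, normalize types)."""
    out = []
    for r in bronze_rows:
        if r.get("order_id") is None or r.get("amount") is None:
            continue  # a real pipeline would quarantine these rows
        out.append({
            "order_id": int(r["order_id"]),
            "region": str(r.get("region", "unknown")).lower(),
            "amount": float(r["amount"]),
        })
    return out

def gold(silver_rows):
    """Gold: business-level aggregate, e.g. revenue per region."""
    totals = {}
    for r in silver_rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

raw = [
    {"order_id": "1", "region": "EU", "amount": "10.0"},
    {"order_id": "2", "region": "US", "amount": "5.5"},
    {"order_id": None, "region": "US", "amount": "3.0"},  # malformed row
]
print(gold(silver(bronze(raw))))  # {'eu': 10.0, 'us': 5.5}
```

The key design point is that each layer has one job: Bronze preserves everything for replay, Silver enforces schema and quality, and Gold serves consumers, which keeps observability and quality checks localized to the Silver boundary.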
Required Qualifications
- Strong experience in:
- Data Warehousing concepts
- Lakehouse architecture
- Medallion Architecture
- Data Observability & Data Quality frameworks
- Data Cataloging tools and practices
- Proficiency in Python (primary development language)
- Hands-on experience with cloud platforms (Azure preferred; AWS acceptable with willingness to quickly learn Azure)
- Strong problem-solving and analytical skills
- Excellent communication and interpersonal skills
**Candidate must be willing to go into office 3 days a week**
Senior Full-Stack AI & Data Engineer – Contract
RBA is an established leader and trusted partner for enterprise and mid-size organizations seeking to transform their business through technology solutions. As a Digital and Technology consultancy, we combine strategic insight with technical expertise to deliver impactful, scalable solutions that align with business goals. We take pride in working with some of the most recognized companies in our market—while fostering a culture that blends challenging career opportunities with a collaborative, fun work environment.
We are seeking a Senior Full-Stack AI & Data Engineer to join our growing Data & AI practice, supporting a high-impact client. In this role, you will lead the design and development of end-to-end AI-powered applications that drive personalization, predictive analytics, and next-generation digital experiences.
You’ll partner with business stakeholders, product teams, and engineers to build production-grade AI solutions—from data pipelines and model development to APIs and user-facing applications. The ideal candidate brings deep expertise across the full stack, modern data platforms, and generative AI technologies, with a passion for solving complex business challenges through innovative solutions.
Responsibilities
- Design and develop end-to-end AI-powered applications, including backend APIs and user-facing interfaces, to enable scalable and intuitive AI solutions.
- Build and maintain robust APIs using technologies such as Node.js, NestJS, or FastAPI, and develop modern web applications using React or similar frameworks.
- Develop, fine-tune, and deploy machine learning models using frameworks such as PyTorch and Scikit-learn.
- Implement advanced generative AI solutions, including Retrieval-Augmented Generation (RAG) pipelines and multi-modal AI applications.
- Design and build agentic AI systems using frameworks such as LangChain, enabling multi-step reasoning, tool use, and automation.
- Architect and optimize end-to-end data pipelines (ETL/ELT) using Python, SQL, and orchestration tools such as Airflow.
- Manage and integrate data workflows within Snowflake, leveraging technologies such as Snowpark or Cortex.
- Implement monitoring and observability for AI systems, including tracking model performance, drift, latency, and reliability.
- Design and deploy cloud-native solutions using Docker, Kubernetes, and CI/CD pipelines across AWS, Azure, or GCP.
- Collaborate with business stakeholders to translate data into actionable insights and intelligent applications.
- Contribute to DevOps best practices, including infrastructure-as-code (Terraform) and automated testing.
- Mentor junior engineers and promote best practices in AI ethics, data governance, and code quality.
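A Retrieval-Augmented Generation (RAG) pipeline, mentioned in the responsibilities above, retrieves the most relevant documents for a query and assembles them into a grounded prompt before calling an LLM. The sketch below uses a toy word-overlap score and a hypothetical corpus purely to show the flow; a production system would use embeddings, a vector store, and a framework such as LangChain.

```python
# Hedged sketch of a RAG flow: retrieve top-k context, then build a prompt.
# Corpus, scoring, and prompt template are illustrative assumptions only.

def score(query, doc):
    """Toy relevance score: count of words shared by query and document."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d)

def retrieve(query, corpus, k=2):
    """Return the top-k documents by descending relevance score."""
    return sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query, corpus):
    """Assemble an LLM prompt with retrieved context ahead of the question."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Delta Lake adds ACID transactions to data lakes",
    "Airflow schedules and orchestrates data pipelines",
    "React renders user interfaces from components",
]
prompt = build_prompt("how are data pipelines scheduled", docs)
```

Constraining the model to "only this context" is the step that grounds answers in retrieved data; monitoring retrieval quality (hit rate, drift) is part of the observability work the posting describes.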
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
- 5+ years of experience across full-stack development, including backend (Node.js/Python) and frontend frameworks (React or similar).
- Strong experience designing and building data pipelines and modern data platforms, including expertise in SQL and data modeling.
- Proven experience deploying AI/ML solutions in production environments, including MLOps and model lifecycle management.
- Hands-on experience with generative AI technologies, including LLMs, prompt engineering, and RAG architectures.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Strong understanding of DevOps practices, including CI/CD, containerization, and infrastructure-as-code (Terraform).
- Excellent communication skills and ability to work effectively in client-facing environments.
Preferred Qualifications
- Experience with Snowflake, including Snowpark, Cortex, or similar data platform capabilities.
- Experience building agent-based AI systems or working with frameworks such as LangChain.
- Familiarity with vector databases and semantic search architectures.
- Experience developing mobile applications using React Native or Flutter.
- Knowledge of mobile architecture, UI/UX principles, and API integration patterns.
- Experience deploying applications to Apple App Store or Google Play Store.
- Familiarity with security and authentication protocols, including OAuth2, biometric authentication, and secure data handling.
- Cloud or data platform certifications (AWS, Azure, GCP, Snowflake, or similar).
Leadership & Culture
- Demonstrate leadership through mentorship, technical guidance, and promoting engineering best practices.
- Balance innovation with pragmatism—able to work across cutting-edge AI solutions and foundational data engineering tasks.
- Thrive in a collaborative, fast-paced consulting environment with a strong focus on client impact and delivery excellence.
Senior Analytics Engineer
Overview
A rapidly growing consumer products company is seeking a Senior Analytics Engineer to help build and scale a modern data platform. This role sits at the intersection of analytics engineering, data infrastructure, and business intelligence, enabling teams across the organization to make data-driven decisions.
The company operates a U.S.-based manufacturing environment and a strong direct-to-consumer ecommerce platform. As the organization continues to scale, the data function is being built from the ground up, creating an opportunity for a hands-on engineer to shape the architecture, pipelines, and analytics capabilities of the business.
Responsibilities
Data Platform Development
- Build, maintain, and optimize data models using SQL and DBT
- Support migration and development of a centralized data warehouse environment
- Design scalable data architecture and transformation layers
- Improve reliability, performance, and maintainability of analytics infrastructure
Data Pipeline Engineering
- Develop and maintain ETL/ELT pipelines using modern data tools
- Expand and optimize ingestion pipelines from operational systems
- Write custom workflows and integrations using Python
- Ensure data quality, monitoring, and pipeline stability
Business Intelligence & Analytics
- Develop and maintain dashboards and reporting solutions
- Enable self-service analytics for business teams
- Work directly with stakeholders to translate business needs into data solutions
- Support analytics across key functions including:
- Supply chain
- Ecommerce performance
- Marketing analytics
- Sales performance
- Forecasting and operations
Data Governance & Reliability
- Establish trusted datasets and consistent data definitions
- Improve data documentation and discoverability
- Troubleshoot data issues and analytics requests across teams
- Ensure long-term scalability of the analytics ecosystem
Required Qualifications
- 4+ years of experience working with SQL
- 4+ years of experience using DBT
- 4+ years of experience building dashboards and BI solutions
- Experience building and managing data pipelines and ETL workflows
- Strong understanding of data warehousing concepts
- Ability to work independently in a fast-paced, evolving environment
- Strong communication skills and experience collaborating with non-technical stakeholders
Preferred Qualifications
- Experience working with BigQuery
- Experience building dashboards in Looker
- Python for data workflows or ingestion pipelines
- Experience with ecommerce analytics
- Experience analyzing Shopify or similar commerce platforms
- Experience working with manufacturing or supply chain data
Ideal Candidate Background
Strong candidates often come from:
- Ecommerce organizations
- Manufacturing companies
- Businesses operating direct-to-consumer sales models
- Mid-sized companies where individuals have broad ownership of the data stack
Experience analyzing:
- Ecommerce sales performance
- Supply chain operations
- Marketing attribution
- Product and operational data
Work Environment
- Hybrid work model with 2–3 days per week in office
- Collaboration with a small technical team including IT and data science
- Fast-paced environment with significant opportunity to influence the company’s data strategy
- High level of autonomy and ownership over technical solutions
What We're Looking For
- Curious and evidence-driven
- Comfortable working with ambiguity
- Self-directed and proactive
- Passionate about learning new technologies
- A strong problem solver who enjoys building scalable systems
Job Summary:
Our client is seeking a Senior Data Analytics Engineer (Customer Data) to join their team! This position is located in Irving, Texas.
Duties:
- Support cross-functional teams including Marketing, Data Science, Product, and Digital
- Build datasets that power: customer segmentation, personalization workflows, campaign and lifecycle analytics, BI dashboards and KPIs and real-time and ML-driven customer experiences
- Build, optimize, and maintain customer data pipelines using PySpark/Databricks
- Transform raw customer data into analytics‑ready datasets for reporting, segmentation, personalization, and AI/ML applications
- Develop customer behavior metrics, campaign insights, and lifecycle reporting layers
- Design datasets used by Power BI/Tableau; dashboard creation is a plus, not required
- Optimize Databricks performance, addressing issues such as skewed joins, partitioning, sorting, and caching/persist strategy
- Work across AWS/Azure/GCP and integrate pipelines with CDPs
- Participate in ingestion and digestion phases to shape MarTech and BI analytical layers
- Document and uphold data engineering standards, governance, and best practices across teams
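One of the optimizations listed above, handling skewed joins, is commonly done by "salting" hot keys so their rows spread across partitions instead of piling onto one. The sketch below illustrates the salting idea in plain Python rather than actual PySpark (in PySpark the same effect comes from appending a random salt column to the skewed side and replicating the other side across salt values before joining); all table and key names here are invented for illustration:

```python
import zlib
from collections import Counter

NUM_PARTITIONS = 4
SALT_BUCKETS = 4  # number of sub-keys each hot key is split into

def stable_hash(key):
    # Deterministic hash (Python's built-in hash() is randomized per process).
    return zlib.crc32(key.encode())

# A skewed fact table: one "hot" customer id dominates the rows.
rows = [("cust_hot", i) for i in range(1000)] + [(f"cust_{i}", i) for i in range(100)]

# Without salting, every row for the hot key hashes to the same partition.
plain = Counter(stable_hash(k) % NUM_PARTITIONS for k, _ in rows if k == "cust_hot")

# With salting, each row gets a salt in [0, SALT_BUCKETS); partitioning on the
# salted key spreads the hot key's rows evenly. (Additive mixing here is a
# simplified stand-in for hashing the (key, salt) composite.)
salted = Counter(
    (stable_hash(k) + i % SALT_BUCKETS) % NUM_PARTITIONS
    for k, i in rows if k == "cust_hot"
)

print(len(plain), sorted(plain.values()))    # one partition holds all 1000 hot rows
print(len(salted), sorted(salted.values()))  # hot rows spread across 4 partitions
```

The trade-off is that the non-skewed join side must be duplicated once per salt bucket, so salting pays off only when a few keys are genuinely hot; on recent Databricks runtimes, adaptive query execution can perform similar skew handling automatically.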
Desired Skills/Experience:
- 6+ years in Data Engineering or Analytics Engineering
- Strong hands-on experience with: Databricks, PySpark, Python and SQL
- Proven experience with customer/marketing data: segmentation, personalization, campaign analytics, retention, behavioral metrics
- Ability to design performance‑optimized pipelines; batch or near real-time
- Experience building datasets consumed by Power BI/Tableau
- Understanding of CDP workflows, customer identity data, traits/feature modeling, and activation
- Strong communication skills, translating marketing needs into technical data solutions
- Power BI expertise, major plus
- Experience with Delta Lake, orchestration, or feature engineering for ML
- Background as an Analytics Engineer, BI/Data Modeling Engineer, or Data Engineer with strong analytics orientation
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position starts at $140,000. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
A third-generation company with nearly 1,500 associates across six manufacturing facilities in South Carolina, Georgia, Florida, and Virginia, Metromont is a leader and pioneer in the engineering and manufacturing of precast concrete.
Most of all, we're a trusted partner, working side-by-side with our customers from the earliest stages of project design through turnover of the completed structure.
In addition to the production of precast concrete, Metromont provides our customers with complementary design and engineering, hauling, erection, and field services to support their full construction needs.
Across the eastern seaboard, the southeast, and even as far west as Arizona, our customers rely on us to provide innovative precast solutions and the best quality for their parking structures, data centers, multifamily housing, office buildings, warehouses, schools, and stadiums.
And we do, because a trusted partner is who we are - and who we've been for nearly a century.
Senior Structural Engineer
JOB DATA
Department Code: 807X
Account Code: 701000
Department Name: Engineering
Account Name: Non-Plant Exempt
POSITION PURPOSE
This position includes senior-level structural engineers with PE certification who provide leadership, training, industry experience, and technical expertise.
The Senior Structural Engineer has the ability to take on technically complicated, complex projects.
RESPONSIBILITIES Perform engineering calculations and details for all products and connections.
Independently perform lateral analysis and design; troubleshoot and check laterals.
Lead project teams.
Independently manage engineering design aspects of a project with minimal assistance or guidance.
Review contract documents to be familiar with project requirements.
Attend project meetings and lead coordination meetings.
Write and review complex requests for information (RFI).
Resolve design issues independently and assist others with solving engineering design related problems.
Serve as point of contact for Metromont engineers and subcontractors for resolution of more complicated technical questions and problems.
Check engineering design calculations and details of others including that of external consultants for accuracy, efficiency, and adherence to Metromont standards and Engineering Design Process.
Review erection drawings, master shop tickets, and any additional shop tickets to ensure designs and standards are followed.
Stay aware of production through regular plant visits and participation in plant meetings relevant to assigned projects.
Become licensed in states where engineering work is performed and, when assigned, review and stamp erection drawings, calculations, and repair details.
Evaluate design cost as compared to estimate and take appropriate action.
Complete repairs without assistance.
May be an active participant in external industry organizations such as PCI and ACI.
Participate in pre-sale engineering design processes as requested by the sales department.
Must adhere to all Metromont and OSHA safety rules and regulations.
SCOPE OF AUTHORITY
- Works independently with little supervision
- Makes decisions related to their own projects regarding assignment of tasks
- Provides guidance and coaching to Design Engineers; interacts closely with the project team, including project managers, general managers, drafting, and production
- Reports to the Engineering Manager
CHARACTERISTICS (Knowledge, Skills, and Abilities)
- 7 years of relevant engineering experience (internal or external)
- Previous precast concrete engineering design knowledge preferred
- Highly dependable with a strong work ethic
- Eager to learn
- Able to work individually or on a team
- Strongly values relationships and interaction with people
- Maintains a balanced perspective about change; adapts when necessary while placing value in consistency of processes
- Positive outlook
- Computer skills (experience with engineering design software preferred)
- Analytical thinker with above-average problem-solving skills
- Attention to detail and accuracy
- Strong personal organization skills
- Above-average ability to manage multiple priorities
- Self-motivated
- Values teaching and demonstrates a willingness to develop others
- Demonstrated ability to manage multiple projects and priorities, maintain project schedules, and work effectively within a project team
- Above-average assertiveness; proactively addresses project issues
- Strong communication skills
- Able to document ongoing project information for record-keeping purposes
- Able to adapt to changes in work schedules, tasks, or processes
- Values and demonstrates safe working behaviors
EDUCATION AND TECHNOLOGY
- BS Civil Engineering required; structural emphasis preferred
- PE Certification required
WORK ENVIRONMENT / SCHEDULE
- Typically works inside in an office environment
- Monday - Friday, 8am - 5pm; schedule flexibility may be required to meet deadlines
PERSONAL PROTECTIVE EQUIPMENT (PPE)
- Safety glasses, high-visibility vest, hard hat, steel-toed shoes, hearing protection
- PPE required only when working in the plant
PHYSICAL REQUIREMENTS
This is an office position which requires walking, standing, and sitting.
Disclaimer: This job description is not intended to be all-inclusive.
Other duties as assigned may be required.
All associates are expected to conduct themselves in a manner that is consistent with Metromont's core values and to actively participate in all company safety, training, and observation programs.
Metromont LLC (Company) is an equal opportunity employer.
The Company is committed to the spirit and letter of all federal, state and local laws and regulations pertaining to equal opportunity.
To this end, the Company does not discriminate against any individual with regard to race, color, religion, sex, gender identity, sexual orientation, pregnancy (including medical needs due to pregnancy, child birth or other medical conditions), national origin, age, disability, genetic information, veteran status, or other protected status.
This Policy extends to all terms, conditions and privileges of employment, as well as the use of all Company facilities.
The Company is also committed to making reasonable accommodations based on an individual's disability, religion, pregnancy, childbirth and related medical conditions (including, but not limited to, lactation), or any other protected status where a reasonable accommodation is required under the law.
No form of unlawful discrimination, unlawful harassment, unlawful refusal to reasonably accommodate or unlawful retaliation will be tolerated.
Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant.
However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
41 CFR 60-1.35(c)
Job Details
Pay Type: Salary
Education Level: Bachelor's Degree