Cogent Data Solutions LLC Jobs in USA
15,293 positions found
Visa Status: US Citizen or Green Card Only
Location: Irving, TX (Local Candidates Only)
Employment Type: Full-time / Direct Hire
Work Environment: Hybrid (Monday through Thursday - in office / Friday - at home)
***MUST HAVE 10+ YEARS EXPERIENCE AS A DATA ENGINEER***
***US Citizen or Green Card Only***
The AWS Senior Data Engineer will own the planning, design, and implementation of data structures for this leading Hospitality Corporation in their AWS environment. This role will be responsible for incorporating all internal and external data sources into a robust, scalable, and comprehensive data model within AWS to support business intelligence and analytics needs throughout the company.
Responsibilities:
- Collaborate with cross-functional teams to understand and define business intelligence needs and translate them into data modeling solutions
- Develop, build, and maintain scalable data pipelines, data schema designs, and dimensional data models in Databricks and AWS for all system data sources, API integrations, and bespoke data ingestion files from external sources, including both batch and real-time pipelines
- Responsible for data cleansing, standardization, and quality control
- Create data models that will support comprehensive data insights, business intelligence tools, and other data science initiatives
- Create data models and ETL procedures with traceability, data lineage and source control
- Design and implement data integration and data quality framework
- Implement data monitoring best practices with trigger based alerts for data processing KPIs and anomalies
- Investigate and remediate data problems, performing and documenting thorough and complete root cause analyses. Make recommendations for mitigating and preventing future issues.
- Work with Business and IT to assess efficacy of all legacy data sources, making recommendations for migration, anonymization, archival and/or destruction.
- Continually seek to optimize performance through database indexing, query optimization, stored procedures, etc.
- Ensure compliance with data governance and data security requirements, including data life cycle management, purge and traceability.
- Create and manage documentation and change control mechanisms for all technical design, implementations and systems maintenance.
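The trigger-based KPI monitoring described in the responsibilities above can be sketched as a simple threshold check. This is an illustrative outline in plain Python; the metric names and thresholds are invented for the example:

```python
# Hypothetical sketch: trigger-based alerting on data processing KPIs.
from dataclasses import dataclass

@dataclass
class KpiThreshold:
    name: str
    min_value: float
    max_value: float

def check_kpis(metrics: dict, thresholds: list) -> list:
    """Return alert messages for any KPI outside its allowed range."""
    alerts = []
    for t in thresholds:
        value = metrics.get(t.name)
        if value is None:
            alerts.append(f"{t.name}: metric missing")
        elif not (t.min_value <= value <= t.max_value):
            alerts.append(f"{t.name}: {value} outside [{t.min_value}, {t.max_value}]")
    return alerts

thresholds = [
    KpiThreshold("row_count", 1_000, 10_000_000),  # expected daily volume
    KpiThreshold("null_rate", 0.0, 0.05),          # at most 5% nulls in key columns
]
for alert in check_kpis({"row_count": 120, "null_rate": 0.12}, thresholds):
    print("ALERT:", alert)
```

In a real pipeline the alerts would feed a notification channel or an incident queue rather than `print`.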
Target Skills and Experience
- Bachelor's or graduate degree in computer science, information systems or related field preferred, or similar combination of education and experience
- At least 10 years’ experience designing and managing data pipelines, schema modeling, and data processing systems.
- Experience with Databricks a plus (or similar tools like Microsoft Fabric, Snowflake, etc.) to drive scalable data solutions.
- Experience with SAP a plus
- Proficient in Python, with a track record of solving real-world data challenges.
- Advanced SQL skills, including experience with database design, query optimization, and stored procedures.
- Experience with Terraform or other infrastructure-as-code tools is a plus.
**Must be able to be onsite in Farmington, CT 2 days a week for collaboration**
The Opportunity: We are seeking a software engineer/developer or ETL/data integration/big data developer with experience in projects emphasizing data processing and storage. This person will be responsible for supporting the data ingestion, transformation, and distribution to end consumers. Candidate will perform requirements analysis, design/develop process flow, unit and integration tests, and create/update process documentation.
- Work with the Business Intelligence team and operational stakeholders to design and implement both the data presentation layer available to the user community and the underlying technical architecture of the data warehousing environment.
- Develop scalable and reliable data solutions to move data across systems from multiple sources in both real-time and batch modes.
- Design and develop database objects: tables, stored procedures, views, etc.
- Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
- Design and develop ETL processes that transform a variety of raw data, flat files, and Excel spreadsheets into SQL databases.
- Understand the concepts of data marts and data lakes, with experience migrating legacy systems to data marts/lakes.
- Use additional cloud technologies (e.g., cloud services such as Azure SQL Server).
- Maintain comprehensive project documentation.
- Demonstrate aptitude for learning new technologies and the ability to perform continuous research, analysis, and process improvement.
- Apply strong interpersonal and communication skills to work in a team environment that includes customer and contractor technical staff, end users, and management team members.
- Manage multiple projects, responsibilities, and competing priorities.
Requirements / Experience Needed:
- Programming languages, frameworks, and file formats such as Python, SQL, PL/SQL, and VB
- Database platforms such as Oracle, SQL Server, MySQL
- Big data concepts and technologies such as Synapse and Databricks
- AWS and Azure cloud computing
- HVR data replication
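A minimal sketch of the flat-file-to-SQL-database ETL this role describes, using Python's standard library; sqlite3 stands in for Oracle/SQL Server/MySQL, and the CSV contents and table schema are invented for the example:

```python
# Hypothetical sketch: load a flat file into a SQL table with basic cleansing.
import csv
import io
import sqlite3

RAW = "id,name,amount\n1,acme, 100\n2,globex,250\n"  # stand-in for a CSV file on disk

def load_csv_to_table(conn, csv_text: str) -> int:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    rows = [
        (int(r["id"]), r["name"].strip(), float(r["amount"]))  # cast types, trim whitespace
        for r in csv.DictReader(io.StringIO(csv_text))
    ]
    # INSERT OR REPLACE keeps reloads of the same file idempotent
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
print(load_csv_to_table(conn, RAW))                                   # 2
print(conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0])   # 350.0
```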
Location: Anywhere in Country
At EY, we’re all in to shape your future with confidence.
We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.
AI & Data - Data Architecture – Senior Manager – Power & Utilities Sector
EY is seeking a motivated professional with solid experience in the utilities sector to serve as a Senior Manager who possesses a robust background in Data Architecture, Data Modernization, End to end Data capabilities, AI, Gen AI, Agentic AI, preferably with a power systems / electrical engineering background and having delivered business use cases in Transmission / Distribution / Generation / Customer. The ideal candidate will have a history of working for consulting companies and be well-versed in the fast-paced culture of consulting work. This role is dedicated to the utilities sector, where the successful candidate will craft, deploy, and maintain large-scale AI data ready architectures.
The opportunity
You will help our clients enable better business outcomes while working in the rapidly growing Power & Utilities sector. You will have the opportunity to lead and develop your skill set to keep up with the ever-growing demands of the modern data platform. During implementation you will solve complex analytical problems to bring data to insights and enable the use of ML and AI at scale for your clients. This is a high growth area and a high visibility role with plenty of opportunities to enhance your skillset and build your career.
As a Senior Manager in Data Architecture, you will have the opportunity to lead transformative technology projects and programs that align with our organizational strategy to achieve impactful outcomes. You will provide assurance to leadership by managing timelines, costs, and quality, and lead both technical and non-technical project teams in the development and implementation of cutting-edge technology solutions and infrastructure. You will have the opportunity to be face to face with external clients and build new and existing relationships in the sector. Your specialized knowledge in project and program delivery methods, including Agile and Waterfall, will be instrumental in coaching others and proposing solutions to technical constraints.
Your key responsibilities
In this pivotal role, you will be responsible for the effective management and delivery of one or more processes, solutions, and projects, with a focus on quality and effective risk management. You will drive continuous process improvement and identify innovative solutions through research, analysis, and best practices. Managing professional employees or supervising team members to deliver complex technical initiatives, you will apply your depth of expertise to guide others and interpret internal/external issues to recommend quality solutions. Your responsibilities will include:
As Data Architect – Senior Manager, you will have an expert understanding of data architecture and data engineering and will be focused on problem-solving to design, architect, and present findings and solutions, leading more junior team members, and working with a wide variety of clients to sell and lead delivery of technology consulting services. You will be the go-to resource for understanding our clients’ problems and responding with appropriate methodologies and solutions anchored around data architectures, platforms, and technologies. You are responsible for helping to win new business for EY. You are a trusted advisor with a broad understanding of digital transformation initiatives, the analytic technology landscape, industry trends and client motivations. You are also a charismatic communicator and thought leader, capable of going toe-to-toe with the C-level in our clients and prospects and willing and able to constructively challenge them.
Skills and attributes for success
To thrive in this role, you will need a combination of technical and business skills that will make a significant impact. Your skills will include:
- Technical Skills: Applications Integration
- Cloud Computing and Cloud Computing Architecture
- Data Architecture Design and Modelling
- Data Integration and Data Quality
- AI/Agentic AI driven data operations
- Experience delivering business use cases in Transmission / Distribution / Generation / Customer.
- Strong relationship management and business development skills.
- Become a trusted advisor to your clients’ senior decision makers and internal EY teams by establishing credibility and expertise in both data strategy in general and in the use of analytic technology solutions to solve business problems.
- Engage with senior business leaders to understand and shape their goals and objectives and their corresponding information needs and analytic requirements.
- Collaborate with cross-functional teams (Data Scientists, Business Analysts, and IT teams) to define data requirements, design solutions, and implement data strategies that align with our clients’ objectives.
- Organize and lead workshops and design sessions with stakeholders, including clients, team members, and cross-functional partners, to capture requirements, understand use cases, personas, key business processes, brainstorm solutions, and align on data architecture strategies and projects.
- Lead the design and implementation of modern data architectures, supporting transactional, operational, analytical, and AI solutions.
- Direct and mentor global data architecture and engineering teams, fostering a culture of innovation, collaboration, and continuous improvement.
- Establish data governance policies and practices, including data security, quality, and lifecycle management.
- Stay abreast of industry trends and emerging technologies in data architecture and management, recommending innovations and improvements to enhance our capabilities.
To qualify for the role, you must have
- A Bachelor’s degree in a STEM field is required
- 12+ years professional consulting experience in industry or in technology consulting.
- 12+ years hands-on experience in architecting, designing, delivering or optimizing data lake solutions.
- 5+ years’ experience with native cloud products and services such as Azure or GCP.
- 8+ years of experience mentoring and leading teams of data architects and data engineers, fostering a culture of innovation and professional development.
- In-depth knowledge of data architecture principles and best practices, including data modelling, data warehousing, data lakes, and data integration.
- Demonstrated experience in leading large data engineering teams to design and build platforms with complex architectures and diverse features including various data flow patterns, relational and no-SQL databases, production-grade performance, and delivery to downstream use cases and applications.
- Hands-on experience in designing end-to-end architectures and pipelines that collect, process, and deliver data to its destination efficiently and reliably.
- Proficiency in data modelling techniques and the ability to choose appropriate architectural design patterns, including Data Fabrics, Data Mesh, Lake Houses, or Delta Lakes.
- Experience managing complex data analysis, migration, and integration of enterprise solutions to modern platforms, including code efficiency and performance optimizations.
- Previous hands‑on coding skills in languages commonly used in data engineering, such as Python, Java, or Scala.
- Ability to design data solutions that can scale horizontally and vertically while optimizing performance.
- Experience with containerization technologies like Docker and container orchestration platforms like Kubernetes for managing data workloads.
- Experience in version control systems (e.g. Git) and knowledge of DevOps practices for automating data engineering workflows (DataOps).
- Practical understanding of data encryption, access control, and security best practices to protect sensitive data.
- Experience leading Infrastructure and Security engineers and architects in overall platform build.
- Excellent leadership, communication, and project management skills.
- Data Security and Database Management
- Enterprise Data Management and Metadata Management
- Ontology Design and Systems Design
Ideally, you’ll also have
- Master’s degree in Electrical / Power Systems Engineering, Computer science, Statistics, Applied Mathematics, Data Science, Machine Learning or commensurate professional experience.
- Experience working at big 4 or a major utility.
- Experience with cloud data platforms like Databricks.
- Experience in leading and influencing teams, with a focus on mentorship and professional development.
- A passion for innovation and the strategic application of emerging technologies to solve real-world challenges.
- The ability to foster an inclusive environment that values diverse perspectives and empowers team members.
- Building and Managing Relationships
- Client Trust and Value and Commercial Astuteness
- Communicating With Impact and Digital Fluency
What we look for
We are looking for top performers who demonstrate a blend of technical expertise and business acumen, with the ability to build strong client relationships and lead teams through change. Emotional agility and hybrid collaboration skills are key to success in this dynamic role.
What we offer you
At EY, we’ll develop you with future-focused skills and equip you with world-class experiences. We’ll empower you in a flexible environment, and fuel you and your extraordinary talents in a diverse and inclusive culture of globally connected teams. Learn more.
- We offer a comprehensive compensation and benefits package where you’ll be rewarded based on your performance and recognized for the value you bring to the business. The base salary range for this job in all geographic locations in the US is $144,000 to $329,100. The base salary range for New York City Metro Area, Washington State and California (excluding Sacramento) is $172,800 to $374,000. Individual salaries within those ranges are determined through a wide variety of factors including but not limited to education, experience, knowledge, skills and geography. In addition, our Total Rewards package includes medical and dental coverage, pension and 401(k) plans, and a wide range of paid time off options.
- Join us in our team‑led and leader‑enabled hybrid model. Our expectation is for most people in external, client serving roles to work together in person 40-60% of the time over the course of an engagement, project or year.
- Under our flexible vacation policy, you’ll decide how much vacation time you need based on your own personal circumstances. You’ll also be granted time off for designated EY Paid Holidays, Winter/Summer breaks, Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well‑being.
Are you ready to shape your future with confidence? Apply today.
EY accepts applications for this position on an on‑going basis.
EY focuses on high‑ethical standards and integrity among its employees and expects all candidates to demonstrate these qualities.
EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.
Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.
EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
EY provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, genetic information, national origin, protected veteran status, disability status, or any other legally protected basis, including arrest and conviction records, in accordance with applicable law.
EY is committed to providing reasonable accommodation to qualified individuals with disabilities including veterans with disabilities. If you have a disability and either need assistance applying online or need to request an accommodation during any part of the application process, please call 1-800-EY-HELP3, select Option 2 for candidate related inquiries, then select Option 1 for candidate queries and finally select Option 2 for candidates with an inquiry which will route you to EY’s Talent Shared Services Team (TSS) or email the TSS at .
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Applications Development: Experience developing GenAI applications is a strong plus.
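The Medallion Architecture named in the position overview layers data as bronze (raw), silver (cleansed and conformed), and gold (business-level aggregates). A toy sketch of that flow, with plain Python lists standing in for Delta tables; a real Databricks pipeline would use PySpark and Delta Lake, and the records here are invented:

```python
# Illustrative sketch of the Medallion Architecture's three layers.

bronze = [  # raw ingested records, kept as-is
    {"sku": "A1", "qty": "3", "price": "10.00"},
    {"sku": "a1", "qty": "2", "price": "10.00"},
    {"sku": "B2", "qty": "bad", "price": "5.00"},
]

def to_silver(records):
    """Cleanse and conform: normalize keys, cast types, drop invalid rows."""
    silver = []
    for r in records:
        try:
            silver.append(
                {"sku": r["sku"].upper(), "qty": int(r["qty"]), "price": float(r["price"])}
            )
        except ValueError:
            pass  # a real pipeline would quarantine the row, not silently drop it
    return silver

def to_gold(records):
    """Aggregate to a business-level view: revenue per SKU."""
    gold = {}
    for r in records:
        gold[r["sku"]] = gold.get(r["sku"], 0.0) + r["qty"] * r["price"]
    return gold

print(to_gold(to_silver(bronze)))  # {'A1': 50.0} — the malformed B2 row was dropped
```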
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges, ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Microsoft Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
The Data Engineering Manager is responsible for leading and developing a team of Data Architects and Data Solutions Engineers while actively contributing to hands-on technical projects. This role will manage the data warehouse in Snowflake, engineering automations in Alteryx and/or other solutions, while ensuring efficient project intake and prioritization. The ideal candidate combines strong technical expertise with proven technical leadership skills to drive innovation and operational excellence across the data engineering function.
As a Data Engineering Manager, you will:
- Set the technical strategy for data engineering solutions and data architecture, including end-to-end data pipeline strategy, consumption management, project scoping, and data automation.
- Design, develop, and optimize data engineering solutions using Snowflake, DBT, Azure Data Factory, and Alteryx.
- Continuously assess and optimize the data engineering technology stack to ensure scalability, performance, and alignment with industry best practices.
- Implement best practices for data modeling, ETL/ELT processes, and automation.
- Own and maintain the Snowflake data warehouse roadmap and engineering standards.
- Lead data project scoping, prioritization, and resource allocation to ensure timely delivery of data engineering solutions.
- Ensure data integrity, security, and compliance across all engineering solutions.
- Collaborate with IT and the rest of the data teams to align solutions with enterprise architecture.
- Establish documentation and governance standards for data engineering workflows ensuring completeness, audit readiness, and traceability in alignment with enterprise architecture.
- Directly supervise the Data Architecture & Data Engineering team in accordance with Nicolet's policies and applicable laws. Responsibilities include interviewing, hiring, and training employees; planning, assigning, and directing work; appraising performance; coaching, mentoring and development planning; rewarding and disciplining employees; addressing complaints and resolving problems.
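The Snowflake ELT best practices above typically hinge on idempotent incremental loads. A hedged sketch of that MERGE-style upsert pattern, using SQLite's UPSERT syntax in place of a Snowflake MERGE statement; the table and columns are invented for the example:

```python
# Hypothetical sketch: an idempotent incremental load (MERGE/upsert pattern).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, updated_at TEXT)")

def merge_batch(conn, batch):
    # Insert new keys; on conflict, update only if the incoming record is newer.
    conn.executemany(
        """INSERT INTO customers (id, email, updated_at) VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET
             email = excluded.email, updated_at = excluded.updated_at
           WHERE excluded.updated_at > customers.updated_at""",
        batch,
    )
    conn.commit()

merge_batch(conn, [(1, "a@x.com", "2024-01-01"), (2, "b@x.com", "2024-01-01")])
merge_batch(conn, [(1, "a-new@x.com", "2024-02-01")])  # newer record wins
merge_batch(conn, [(2, "stale@x.com", "2023-12-01")])  # older record is ignored
print(conn.execute("SELECT email FROM customers ORDER BY id").fetchall())
```

Because re-running the same batch leaves the table unchanged, failed loads can simply be replayed.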
Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, Data Analytics or related field.
- 7+ years in data engineering or related data roles required.
- 3+ years in leadership or management positions required.
- Strong technical expertise in Snowflake, DBT, Azure Data Factory and SQL or like systems.
- Familiarity with Alteryx, UiPath, Tableau, Power BI and Salesforce is preferred.
- Ability to design and implement scalable data solutions.
- Excellent leadership, communication, and organizational skills
- Ability to balance hands-on development with team development.
- Must be able to work fully in-office. This position does not allow for remote work.
Benefits:
- Medical, Dental, Vision, & Life Insurance
- 401(k) with a company match
- PTO & 11 1/2 Paid Holidays
The above statements are intended to describe the general nature and level of work being performed. They are not intended to be construed as an exhaustive list of all responsibilities and skills required for the position.
Equal Opportunity Employer/Veterans/Disabled
Job Summary:
Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high quality of the data to end users
- Ensure high quality of the inhouse data via data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python and following the team’s established processes and methodologies
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and capable of creating new documentation templates to improve the existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL and continuously engage with end consumers with enhancements
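The data quality rules and automated monitoring duties above can be sketched as a small declarative rule set; the rule names and record fields here are invented for illustration:

```python
# Hypothetical sketch: declarative data quality rules with failure profiling.
rules = {
    "email_present": lambda r: bool(r.get("email")),
    "qty_non_negative": lambda r: r.get("qty", 0) >= 0,
}

def profile(records):
    """Return failure counts per rule, suitable for monitoring/reporting."""
    failures = {name: 0 for name in rules}
    for r in records:
        for name, check in rules.items():
            if not check(r):
                failures[name] += 1
    return failures

data = [{"email": "a@x.com", "qty": 2}, {"email": "", "qty": -1}]
print(profile(data))  # {'email_present': 1, 'qty_non_negative': 1}
```

In practice the failure counts would feed remediation tickets or a quality dashboard rather than being printed.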
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles (e.g., big data cloud compute, NoSQL)
- Applied experience querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- Big Query experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position is $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure the accuracy and consistency of data.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained, whether through formal education or experience, in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
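The monitoring and troubleshooting duties above often reduce to a retry-and-alert pattern around each pipeline step. The sketch below is a minimal plain-Python illustration; the `extract_orders` step and its failure behavior are hypothetical, and a production pipeline would typically delegate retries and alerting to an orchestrator such as Airflow or Cloud Composer.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(step, attempts=3, base_delay=0.05):
    """Wrap a pipeline step so transient failures are retried with exponential backoff."""
    def wrapper(*args, **kwargs):
        for attempt in range(1, attempts + 1):
            try:
                return step(*args, **kwargs)
            except Exception as exc:
                log.warning("step %s failed (attempt %d/%d): %s",
                            step.__name__, attempt, attempts, exc)
                if attempt == attempts:
                    raise  # surface the failure so alerting can fire
                time.sleep(base_delay * 2 ** (attempt - 1))
    return wrapper

# Hypothetical flaky extract step, used only to exercise the retry logic.
calls = {"n": 0}

@with_retries
def extract_orders():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return [{"order_id": 1, "amount": 42.0}]
```

Here the first two calls raise and are retried; the third succeeds, so downstream steps see data while the warning log feeds monitoring.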
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Experience establishing and enforcing data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Ability to collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- Ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
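The RAG qualification above can be made concrete with a toy retrieval sketch. This is not a real vector database client: `ToyVectorIndex` is a hypothetical in-memory stand-in for a service like Pinecone or Vertex AI Vector Search, and the bag-of-words `embed` function stands in for a learned embedding model.

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding'; a real pipeline would call an embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

class ToyVectorIndex:
    """In-memory stand-in for a vector database (e.g., Pinecone)."""
    def __init__(self):
        self.docs = []

    def upsert(self, doc_id, text):
        self.docs.append((doc_id, text, embed(text)))

    def query(self, text, top_k=1):
        q = embed(text)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[2]), reverse=True)
        return [(doc_id, body) for doc_id, body, _ in ranked[:top_k]]

index = ToyVectorIndex()
index.upsert("kb-1", "store hours and holiday schedule")
index.upsert("kb-2", "produce supplier onboarding checklist")
hits = index.query("what are the holiday store hours", top_k=1)
```

In an actual RAG pipeline, the retrieved passages in `hits` would be injected into the prompt of a language model rather than returned directly.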
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Job Title: Senior Data Engineer / Analytics Engineer
Location: West Los Angeles, CA (Onsite)
Compensation: $180,000 base salary + 10% bonus
Overview
We are looking for a Senior Data Engineer / Analytics Engineer to help architect and build scalable data solutions that power business insights for sales and marketing teams. This role is ideal for someone who enjoys being both strategic and hands-on, designing modern data architectures while actively building pipelines, models, and dashboards.
The ideal candidate has deep experience in modern data stack technologies and has worked closely with high-volume sales and marketing organizations, particularly supporting Salesforce-driven environments.
Key Responsibilities
Data Architecture & Engineering
- Design and build scalable data pipelines and data models that support analytics and reporting across the organization.
- Architect and implement solutions using Snowflake, DBT, Python, and Fivetran within a modern data stack.
- Optimize Snowflake environments for cost and performance, including warehouse configuration, query optimization, and storage strategies.
- Build and maintain robust data transformation pipelines using DBT for modeling, testing, and validation.
Analytics & Business Intelligence
- Develop high-impact dashboards and reporting solutions using Power BI to support decision-making across the business.
- Partner with stakeholders to define KPIs, metrics, and data models that support sales and marketing performance tracking.
- Ensure data reliability, consistency, and accessibility across analytics platforms.
CRM Data & Sales Analytics
- Work extensively with Salesforce data, helping clean, structure, and optimize complex CRM datasets.
- Design scalable data models that support reporting on sales performance, marketing attribution, pipeline analytics, and revenue metrics.
- Implement solutions to improve data quality and usability across CRM-driven reporting.
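The CRM data-cleaning work described above typically starts with field normalization and deduplication. A minimal sketch, assuming hypothetical record shapes; real Salesforce exports carry many more fields, and production matching usually adds fuzzy rules:

```python
# Hypothetical account records; real Salesforce exports have many more fields.
raw_accounts = [
    {"Id": "001A", "Name": "Acme Corp ", "Email": "SALES@ACME.COM"},
    {"Id": "001B", "Name": "acme corp", "Email": "sales@acme.com"},
    {"Id": "002A", "Name": "Globex", "Email": "info@globex.com"},
]

def normalize(rec):
    """Standardize the fields used for duplicate matching."""
    return {
        "Id": rec["Id"],
        "Name": " ".join(rec["Name"].lower().split()),
        "Email": rec["Email"].strip().lower(),
    }

def dedupe(records, key=("Name", "Email")):
    """Keep the first record seen for each normalized match key."""
    seen, clean = set(), []
    for rec in map(normalize, records):
        match = tuple(rec[k] for k in key)
        if match not in seen:
            seen.add(match)
            clean.append(rec)
    return clean

accounts = dedupe(raw_accounts)
```

With the two "Acme" variants collapsed into one record, downstream sales and attribution reports count each account once instead of splitting its revenue across duplicates.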
Business Partnership
- Partner closely with Sales and Marketing teams in a high-volume sales environment to understand reporting needs and deliver actionable insights.
- Translate business questions into scalable data solutions and analytics frameworks.
- Communicate technical concepts clearly to non-technical stakeholders and collaborate effectively across teams.
Required Qualifications
- 5+ years of BI Engineering, Data Engineering, or Analytics Engineering experience.
- Proven experience acting as both a data architect and hands-on builder.
- Strong experience with:
- Snowflake (including cost and performance optimization)
- DBT for transformations, modeling, and data validations
- Python
- Power BI - must have
- Experience working with Salesforce or similar CRM tools, including cleaning, structuring, and building scalable reporting solutions for complex CRM datasets.
- Experience supporting Sales and Marketing teams in high-volume sales environments.
- Strong communication skills and ability to work collaboratively with cross-functional stakeholders.
Preferred Qualifications
- Experience with Salesforce data architecture and CRM analytics.
- Background working with large-scale sales operations or marketing analytics teams.
- Experience building modern ELT data pipelines and scalable analytics frameworks.
Work Environment
- Onsite role in West Los Angeles
- Highly collaborative environment working closely with data, sales, marketing, and leadership teams.
*At Securian Financial the internal position title is Data Science Sr Analyst or Data Science Consultant. The title and salary will be determined based on experience and applied skills.*
Summary
As an Operational Support Data Scientist at Securian Financial, you will bridge advanced analytics and day-to-day business operations by designing, deploying, monitoring, and continuously improving AI-driven solutions that support enterprise processes.
This role focuses on supporting reliable, scalable, and explainable AI solutions that enhance operational efficiency, decision support, customer experience, and risk management across Digital, Marketing, Sales, and Servicing functions.
You will operate at the intersection of data science, MLOps, and the business - ensuring models are maintained, enhanced, monitored, and aligned with Securian's Enterprise Data Strategy Vision and Operating Principles.
Responsibilities include but are not limited to:
AI Solution Development & Deployment
Work with business teams to enhance and optimize existing AI/ML solutions.
Deploy and manage solutions using cloud-native tools (e.g., AWS SageMaker).
Operational Model Support & Optimization
Monitor model performance, data drift, and operational KPIs.
Troubleshoot production issues and continuously enhance and optimize models for performance, stability, and cost efficiency.
Establish measurement frameworks to quantify operational impact of deployed solutions.
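One common way to monitor the data drift mentioned above is the Population Stability Index (PSI), which compares a feature's production distribution against its training-time baseline. A minimal sketch in plain Python; the bin count, smoothing, and the 0.2 rule-of-thumb threshold are conventional choices for illustration, not a Securian standard:

```python
import math

def psi(expected, actual, bins=4):
    """Population Stability Index between two numeric samples.
    Rule of thumb: PSI > 0.2 often signals meaningful drift."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def frac(sample):
        counts = [0] * bins
        for x in sample:
            counts[sum(x > e for e in edges)] += 1
        # Laplace-style smoothing so no bin fraction is zero (avoids log(0)).
        return [(c + 0.5) / (len(sample) + 0.5 * bins) for c in counts]

    e, a = frac(expected), frac(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [0.1 * i for i in range(100)]          # training-time score distribution
shifted  = [0.1 * i + 4.0 for i in range(100)]    # hypothetical drifted production scores
```

A scheduled job could compute PSI per feature on each scoring batch and raise an alert when the index crosses the chosen threshold.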
Data Engineering & Analytical Execution
Transform structured, semi-structured, and unstructured data into actionable features and insights.
Perform exploratory analysis and visualization to identify operational improvement opportunities.
Collaborate with engineering teams to productionize data solutions.
Stakeholder Engagement & Explainability
Partner with cross-functional operational stakeholders to understand business workflows and translate them into AI-enabled solutions.
Communicate complex AI methodologies and results clearly to technical and non-technical audiences.
Ensure model transparency, explainability, fairness, and ethical AI application in alignment with enterprise governance standards.
Required Qualifications
Demonstrated experience developing, deploying, or supporting production AI/ML models in cloud environments.
Strong proficiency in Python and experience with tools such as AWS SageMaker and GitHub.
Experience building operationalized data science solutions (not just prototypes).
Strong understanding of statistical modeling, machine learning algorithms, and model validation techniques.
Ability to clearly explain technical concepts, model outputs, and operational trade-offs to stakeholders.
Strong ethical judgment with a commitment to responsible and unbiased AI development.
Preferred Qualifications
2+ years of hands-on experience in data science, applied AI, or machine learning.
Experience supporting AI solutions in operational or production environments.
Familiarity with MLOps practices, model governance frameworks, and automation tooling.
Experience working in regulated industries (financial services preferred).
#LI-hybrid **This position will be in a hybrid working arrangement.**
Securian Financial believes in hybrid work as an integral part of our culture. Associates get the benefit of working both virtually and in our offices. If you're in a commutable distance (90 minutes) you'll join us 3 days each week in our offices to collaborate and build relationships. Our policy allows flexibility for the reality of business and personal schedules.
The estimated base pay range for this job is:
$72,000.00 - $134,000.00. Pay may vary depending on job-related factors and individual experience, skills, knowledge, etc. More information on base pay and incentive pay (if applicable) can be discussed with a member of the Securian Financial Talent Acquisition team.
Be you. With us. At Securian Financial, we understand that attracting top talent means offering more than just a job - it means providing a rewarding and fulfilling career. As a valued member of our high-performing team, we want you to connect with your work, your relationships and your community. Enjoy our comprehensive range of benefits designed to enhance your professional growth, well-being and work-life balance, including the advantages listed here:
Paid time off:
We want you to take time off for what matters most to you. Our PTO program provides flexibility for associates to take meaningful time away from work to relax, recharge and spend time doing what's important to them. And Securian Financial rewards associates for their service by providing additional PTO the longer you stay at Securian.
Leave programs: Securian's flexible leave programs allow time off from work for parental leave, caregiver leave for family members, bereavement and military leave.
Holidays: Securian provides nine company paid holidays.
Company-funded pension plan and a 401(k) retirement plan: Share in the success of our company. Securian's 401(k) company contribution is tied to our performance up to 10 percent of eligible earnings, with a target of 5 percent. The amount is based on company results compared to goals related to earnings, sales and service.
Health insurance: From the first day of employment, associates and their eligible family members - including spouses, domestic partners and children - are eligible for medical, dental and vision coverage.
Volunteer time: We know the importance of community. Through company-sponsored events, volunteer paid time off, a dollar-for-dollar matching gift program and more, we encourage you to support organizations important to you.
Associate Resource Groups: Build connections, be yourself and develop meaningful relationships at work through associate-led ARGs. Dedicated groups focus on a variety of interests and affinities, including:
Mental Wellness and Disability
Pride at Securian Financial
Securian Young Professionals Network
Securian Multicultural Network
Securian Women and Allies Network
Servicemember Associate Resource Group
For more information regarding Securian's benefits, please review our Benefits page.
This information is not intended to explain all the provisions of coverage available under these plans. In all cases, the plan document dictates coverage and provisions.
Securian Financial Group, Inc. does not discriminate based on race, color, religion, national origin, sex, gender, gender identity, sexual orientation, age, marital or familial status, pregnancy, disability, genetic information, political affiliation, veteran status, status in regard to public assistance or any other protected status. If you are a job seeker with a disability and require an accommodation to apply for one of our jobs, please contact us by email at , by telephone (voice), or 711 (Relay/TTY).
To view our privacy statement click here
To view our legal statement click here
Remote working/work at home options are available for this role.
Job: Data-MDM Architect (Profisee) with BA/PM experience
Location: Waukesha/Milwaukee, Wisconsin
Mode: Work from office, at least 3 days a week
Primary Purpose
- Responsible for designing and architecting data/MDM solutions and for analyzing, implementing, and deploying these solutions both on-premises and in the cloud. Collaborating with diverse business teams and drawing on extensive knowledge of big data tools and products, this role creates scalable, flexible, and comprehensive data solutions that tackle complex business challenges.
Major Responsibilities
- Manage the technical delivery of medium to large, moderately complex projects on-time with targeted zero defects.
- Provide planning, estimation, scheduling, prioritization, and coordination of technical activities related to enterprise-wide data solutions, both in the cloud and on-premises.
- Ensure solutions alignment to Enterprise Architecture policies and best practices; ensure that process methodologies are followed in development.
- Accountable to business and technology management for end-to-end application scoping, planning, development, and delivery that meets or exceeds quality standards.
- Identify and manage dependencies and downstream impacts of the project to minimize adverse effects on other projects and/or programs.
- Assist the project manager with estimating technical timelines and allocating technical resources to specific tasks.
- Communicate expectations, roles, and responsibilities to team members and hold them accountable for meeting them.
- Collaborate with IT partners to devise a capacity plan and ensure appropriate infrastructure for end-to-end system delivery.
- Supervise contingent workers and their daily tasks including onshore and offshore staff.
- Identify valuable data sources and automate collection processes.
- Maintain data accuracy and timeliness, a critical highly visible aspect of the position as it impacts supply chain and sales effectiveness, financial performance of the business, and customer perception through on-time delivery, working capital, financial reporting accuracy and product quality.
- Architect and design master data to drive towards “Single source of the truth”.
- Regularly monitor and measure performance of MDM standards.
- Perform problem and trend analyses to identify and correct problems and improve data quality.
- Review / Approve execution of data changes.
- Track and report through the CAB review board.
- Develop SLAs and ensure they are met.
- Drive data mapping workshops for migrations.
- Coordinate and participate in the ETL (extract, transform, load) process for any migrations.
- Plan and architect M&A initiatives and integrations.