Array Declaration In Data Structure Jobs — Salary Jobs in USA

39,019 positions found — Page 2

Data Analyst
✦ New
Salary not disclosed
Raleigh, NC 1 day ago
Job Title: Data Analyst

Location: Remote

Duration: 4 months, possible extension

Hours: M-F 8am - 5pm EST or CST preferred

Why is this role open? (Coverage, looking for perm, etc.) - To complete an ongoing project

Potential to convert to FTE? If so, at what rate:

Possible if headcount is available

Overview of Work Environment/Client Nuances:

Potentially some interaction with the client so they will need to have excellent communication skills

Team Overview:

Will work closely with the hiring manager and a team of data analysts

Resource's typical working day:


  • Data management
  • May do some vendor management
  • Some data presentation
  • Some process improvement
  • Data mining to assist operations around lab equipment maintenance


About the Role:


  • As a Client Data Analyst, you will perform basic analysis to ensure that recommendations and business conclusions are backed by thorough data research and findings.
  • This job is part of the Data Science & Analytics job function, which is responsible for reviewing data that supports improving effectiveness and predicting outcomes to develop business intelligence.


What You'll Do:


  • Coordinate data aggregation and curate reports using existing business intelligence and reporting applications.
  • Perform ad-hoc, strategic review of structured and unstructured data, reflecting global real estate markets and the operations of real estate assets.
  • Assist with developing data structures and pipelines to organize, collect, cleanse, and standardize information to generate insights.
  • Define basic data requirements and gather information using judgment and statistical tests.
  • Use programming and evaluation tools, including open-source programs to plan models and extract insights.
  • Apply modeling and optimization methods to improve business performance.
  • Develop ad-hoc reporting based on the review of existing data sources.
  • Exhibit rigor, judgment, and ability to present a detailed 'data story' to a business line.
  • Confirm the quality and integrity of existing data sources.
  • Collaborate with the agile development team to provide recommendations and communications on enhancing existing or new processes and programs.
  • Have some knowledge of standard principles with limited practical experience in applying them.
  • Lead by example and model behaviors that are consistent with Client RISE values.
  • Impact the quality of own work.
  • Work within standardized procedures and practices to achieve objectives and meet deadlines.
  • Exchange straightforward information, ask questions, and check for understanding.


What You'll Need:


  • Bachelor's Degree preferred with up to 3 years of relevant experience. In lieu of a degree, a combination of experience and education will be considered. MCSE and CNE Certification preferred.
  • Ability to use existing procedures to solve standard problems.
  • Experience with analyzing information and standard practices to make judgments.
  • In-depth knowledge of Microsoft Office products. Examples include Word, Excel, Outlook, etc.
  • Organizational skills with a strong inquisitive mindset.


Licenses/Certifications: n/a

Must Have Skills:


  • Extreme attention to detail
  • Strong time management and a sense of urgency to complete tasks
  • Excellent communication skills
  • Will need to be able to "tell the story of the data"


Nice to have skills:


  • Experience with pharmaceutical equipment management is a huge plus
  • Vantage experience


Years of Experience:


  • 4-7 years


Education

Bachelor's degree highly preferred; experience will be accepted in lieu of a degree.

Software skills:


  • Excel
  • Microsoft Office suite
  • Vantage
  • Smartsheet


Interview Process:

One round, virtual (Teams), with the hiring manager.

Not Specified
Associate Partner, Data and Technology Transformation
✦ New
$250+
Chicago, IL 1 day ago
Introduction
Your role and responsibilities
About the Opportunity

IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.


You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.


This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.


Key Responsibilities
Market Leadership & Growth

  • Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.


  • Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.


  • Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.



Strategic Advisory & Transformation Delivery

  • Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.


  • Architect integrated solutions that include:


  • Migration from legacy data platforms to modern cloud-based architectures


  • Data engineering and Information governance


  • Business intelligence and advanced analytics


  • GenAI-powered and agentic AI-driven automation and decisioning


  • Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.



Engagement Excellence & Financial Stewardship

  • Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.


  • Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.


  • Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.



Practice Building & Talent Development

  • Recruit, mentor, and grow top-tier consultants, architects, and data specialists.


  • Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.


  • Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.



Thought Leadership & Market Presence

  • Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.


  • Represent IBM at industry conferences, client events, webinars, and executive roundtables.


  • Create original thought leadership—articles, perspectives, points of view—that positions IBM as a leading advisor in data and AI-driven transformation.



This position can be performed anywhere in the US.


"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."


Required technical and professional expertise
Qualifications

  • 12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.


  • Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.


  • Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:


  • AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)


  • Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)


  • Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)


  • Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.


  • Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks.


  • Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.


  • Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.


  • Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.


  • Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.


  • Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.



Preferred technical and professional experience

IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.


Not Specified
Senior Data Engineer
✦ New
Salary not disclosed
Boston, Massachusetts 11 hours ago

About the Company

Our client is a well-established global insurance and financial services organization known for its strong reputation, financial stability, and long-term commitment to innovation. The company is investing heavily in modern technology and data platforms to transform how insurance products are delivered to professionals and small businesses.

Within this organization, a rapidly growing digital product team is focused on simplifying insurance through modern technology, data-driven decision making, and scalable cloud infrastructure. The team operates in a collaborative, fast-paced environment and places a strong emphasis on engineering excellence, ownership, and continuous improvement.

About the Role

Our client is seeking a Senior Data Engineer to join their Platform Engineering team in Boston. This role will play a key part in designing, building, and scaling a modern cloud-native data platform used to support analytics, business intelligence, and data-driven decision making across the organization.

This individual will work closely with engineering leaders, data teams, and business stakeholders to develop scalable data integration pipelines and build the foundation for enterprise analytics. The role is highly hands-on and ideal for someone who enjoys building data platforms, working with cloud technologies, and enabling self-service analytics.

The team operates in an Agile environment and values collaboration, ownership, and continuous improvement.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using Azure Data Factory, Databricks, and SQL Server
  • Build configuration-driven ingestion processes to support batch and near real-time data pipelines
  • Integrate and transform data from multiple sources including APIs, JSON, CSV, XLS, and other structured and semi-structured formats
  • Design and document data models, data flows, and data dictionaries to support enterprise analytics
  • Implement Lakehouse architecture using Medallion data modeling patterns
  • Develop datasets and reporting pipelines supporting Power BI and enterprise analytics
  • Maintain the health, security, and scalability of the Azure-based data platform
  • Implement best practices for data governance, metadata management, lineage, and access control
  • Build and maintain CI/CD pipelines for data infrastructure using tools such as GitHub Actions or CircleCI
  • Implement monitoring, alerting, and observability for data pipelines and platform performance
  • Optimize data pipelines for performance, scalability, and cost efficiency
  • Work cross-functionally with engineering, product, and business teams to define and deliver data products
  • Participate in Agile ceremonies including sprint planning, backlog refinement, and delivery cycles
  • Mentor junior engineers and contribute to the ongoing development of the data engineering team
  • Stay current on emerging technologies and best practices in data engineering

Qualifications

  • 7+ years of professional experience in data engineering or data platform development
  • Strong experience working in Azure environments, particularly: Azure Data Factory, Databricks, SQL Server / T-SQL
  • Experience building scalable ETL / ELT data pipelines
  • Experience integrating data from multiple sources including APIs, JSON, XML, CSV, and other formats
  • Strong understanding of data warehousing, dimensional modeling, and Lakehouse architecture
  • Experience with Power BI, SSRS, or other reporting platforms
  • Hands-on experience with CI/CD pipelines and DataOps practices
  • Familiarity with data governance, data quality, metadata management, and MDM
  • Experience working in Agile development environments
  • Strong communication and collaboration skills with the ability to work across technical and business teams
  • Experience mentoring or guiding other engineers is a plus
Not Specified
SAP S/4HANA Functional Process Data Expert
Salary not disclosed
Atlanta 3 days ago
Summary:

Location: Atlanta, GA

Duration: 12 Months

100% Remote – open to any area

Responsibilities:

Partner with global and regional business stakeholders to define data requirements aligned to standardized value stream processes.

Translate business process designs into clear master and transactional data definitions for S/4HANA.

Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.

Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).

Define data objects, attributes, and mandatory fields.

Support business rules, validations, and derivations.

Align data structures to SAP best practices and industry standards.

Support data cleansing, enrichment, and harmonization activities.

Define and validate data mapping rules from legacy systems to S/4HANA.

Participate in mock conversions, data loads, and reconciliation activities.

Ensure data quality thresholds are met prior to cutover.

Support the establishment and enforcement of global data standards and policies.

Work closely with Master Data and Data Governance teams.

Help define roles, ownership, and stewardship models for value stream data.

Contribute to data quality monitoring and remediation processes.

Support functional and integrated testing with a strong focus on data accuracy.

Validate business scenarios using migrated and created data.

Support cutover planning and execution from a data perspective.

Provide post-go-live support and stabilization.

Requirements:

5 years of SAP functional experience with a strong data focus.

Hands-on experience with SAP S/4HANA (greenfield preferred).

Proven involvement in large-scale, global ERP implementations.

Deep understanding of value stream business processes and related data objects.

Experience supporting data migration, cleansing, and validation.

Required Skills:

Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).

Understanding of S/4HANA data model changes vs. ECC.

Experience working with SAP MDG or similar governance tools preferred.

Familiarity with data migration tools (e.g., SAP Migration Cockpit, LVM, ETL tools).

Ability to read and interpret functional specs and data models.

Strong stakeholder management and communication skills.

Ability to work across global, cross-functional teams.

Detail-oriented with strong analytical and problem-solving skills.

Comfortable operating in a fast-paced transformation environment.

Preferred Skills:

Experience in manufacturing, building materials, or asset-intensive industries.

Prior role as Functional Data Lead or Data Domain Lead.

Experience defining global templates and harmonized data models.

Knowledge of data quality tools and metrics.

Experience with MDG and setting up cost center and profit center groups.
Not Specified
Data Integration & AI Engineer
✦ New
Salary not disclosed
Edison, NJ 1 day ago

About Wakefern

Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.


Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.


The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.


Essential Functions

  • Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
  • Implement and enforce data quality and governance standards to ensure data accuracy and consistency.
  • Provide input for project plans and timelines to align with business objectives.
  • Monitor project progress, identify risks, and implement mitigation strategies.
  • Work with cross-functional teams and ensure effective communication and collaboration.
  • Provide regular updates to the management team.
  • Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
  • Communicate and promote the code of ethics and business conduct.
  • Ensure completion of required company compliance training programs.
  • Be trained – either through formal education or through experience – in software/hardware technologies and development methodologies.
  • Stay current through personal development and professional and industry organizations.

Responsibilities

  • Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
  • Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
  • Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
  • Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
  • Ensure data solutions and data sources meet quality, security, and compliance standards.
  • Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
  • Provide technical training, documentation, and ongoing support to end users of data automation systems.
  • Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.


Qualifications

  • A bachelor's degree or higher in computer science, information systems, or a related field.
  • Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
  • Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
  • Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Experience with workflow orchestration tools such as Cloud Composer or Airflow
  • Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
  • Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
  • Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
  • Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
  • Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
  • Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
  • Hands-on experience with IBM DataStage and Alteryx is a plus.
  • Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
  • Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
  • Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
  • Familiarity with data modeling tools.
  • Familiarity with DevOps practices for data (CI/CD pipelines)
  • Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
  • Strong knowledge and skills in data management, data quality, and data governance.
  • Strong communication, collaboration, and problem-solving skills.
  • Ability to work on multiple projects and prioritize tasks effectively.
  • Ability to work independently and in a team environment.
  • Ability to learn new technologies and tools quickly.
  • Ability to handle stressful situations.
  • Highly developed business acumen.
  • Strong critical thinking and decision-making skills.


Working Conditions & Physical Demands

This position requires in-person office presence at least 4x a week.


Compensation and Benefits

The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.

Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.


Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.

Not Specified
Senior Data Architect
✦ New
Salary not disclosed
San Mateo, California 11 hours ago

Job Title: Health Data Services Strategy and Data Architect Manager

Location: San Francisco Bay Area

Work Mode: Hybrid Model – Onsite as Needed, at least 3-6 days a month

Duration: 3 months

Local candidates only

Qualification:

• Healthcare analytics principles and performance measurement frameworks.

• Electronic Health Record systems and healthcare data structures.

• Data warehousing concepts and EPIC HB, PB, and Retail Pharmacy architecture necessary for internal and external reporting.

• Healthcare quality metrics and regulatory reporting requirements.

• Data governance principles and healthcare privacy regulations.

• Supervisory principles and budget preparation.

Knowledge, Skills, & Abilities

• Organize and evaluate healthcare analytics programs.

• Supervise and mentor professional and technical staff.

• Translate complex data into actionable insights.

• Balance competing priorities while maintaining alignment with organizational strategy.

• Facilitate collaborative decision-making and governance processes.

• Interpret and apply healthcare regulations within system configuration and documentation standards.

• Communicate effectively with executive and operational stakeholders.

• Prepare reports, policy recommendations, and budget justifications.

• Identify risks, propose solutions, and drive resolution of system or operational issues.

• Build and maintain effective working relationships across diverse service lines and departments.

Education:

• Any combination of education and experience that would likely provide the required knowledge, skills, and abilities is qualifying.

• A Bachelor's degree in public health, healthcare administration, data science, statistics, information systems, business administration, or a related field; AND

o Seven years of progressively responsible experience in healthcare analytics, healthcare IT, digital health, or performance reporting; INCLUDING

o Three years of supervisory or leadership experience overseeing analysts or technical staff.

Not Specified
Solution Architect - Microsoft Purview (Data Catalog & Governance)
🏢 Spectraforce Technologies
Salary not disclosed
Newark, NJ 3 days ago
Title: Solution Architect - Microsoft Purview (Data Catalog & Governance)

Duration: 10+ Months

Location: Remote

Overview

We are seeking an experienced Solution Architect to lead the enterprise rollout of Microsoft Purview across a complex global, multi-cloud environment. The consultant will define architecture, implement domain-based governance, and drive adoption of Purview capabilities including cataloging, lineage, classification, access governance, and compliance controls.

Key Responsibilities


  • Architecture & Implementation
  • Define target-state architecture for Microsoft Purview across Azure, AWS, M365, on-prem, and third-party platforms.
  • Develop and drive the implementation roadmap across U.S. Businesses, PGIM, Corporate Technology, and international units.
  • Establish Purview reference architecture, integration patterns, and guardrails.
  • Domain-Based Governance
  • Design collections, hierarchies, and RBAC aligned to domain structures and legal entity boundaries.
  • Enable domain-owned stewardship while enforcing enterprise taxonomies and governance standards.
  • Platform Configuration
  • Configure Data Map, Catalog, Scans, Classifications, Sensitivity Labels, and Lineage.
  • Optimize scan strategy (frequency, cost, performance) and extend classifiers and metadata models.
  • Security & Compliance
  • Integrate Purview with M365 Information Protection, Entra ID, and security baselines.
  • Support PII/PCI/PHI detection, access governance, and regulatory compliance (SOX, GLBA, NYDFS, GDPR).
  • Engineering & Integration
  • Integrate with Synapse, Fabric, Databricks (including Unity Catalog), Snowflake, SQL Server, AWS sources, and SAP/Oracle.
  • Implement IaC (Bicep/Terraform), CI/CD for Purview artifacts, and automation via APIs.
  • Adoption & Stakeholder Management
  • Deliver training, onboarding playbooks, and steward enablement.
  • Lead workshops for new data domains and products.
  • Provide executive-level reporting on progress, risks, and KPIs.


Required Qualifications


  • 10+ years in data architecture/governance; 2+ years hands-on Purview experience at enterprise scale.
  • Strong expertise in metadata management, lineage, classification, scan optimization, glossary management, and domain-based operating models.
  • Solid Azure ecosystem knowledge (Storage, Key Vault, Synapse, Fabric, Databricks), M365 Information Protection, and Entra ID.
  • Experience with IaC (Bicep/Terraform), APIs/Atlas, and scripting (PowerShell/Python).
  • Financial services or regulated industry exposure.
  • Excellent communication, stakeholder leadership, and cross-domain facilitation skills.


Not Specified
Data Product Engineer
🏢 Spectraforce Technologies
Salary not disclosed
Newark, NJ 2 days ago
Job Title: Marketplace Data Product Engineer

Duration: 6+ months

Location: 100% Remote

Job Overview

The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains - leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. This role will guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.

Key Responsibilities


  • Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
  • Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
  • Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
  • Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
  • Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
  • Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
  • Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
  • Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
  • Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
  • Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.


Required Skills & Qualifications


  • 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
  • Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations, and metadata-driven architectures.
  • Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
  • Excellent communication and presentation skills, with experience translating technical concepts for business partners.
  • Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
  • Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
  • Strong collaboration and stakeholder management skills across engineering, product, and business teams.
  • Comfortable working in fast-moving environments and driving clarity through ambiguity.


Preferred Qualifications


  • Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
  • Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
  • Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
  • Background in solution architecture, customer engineering, or sales engineering.
  • Experience developing demo environments, sample apps, or repeatable platform enablement assets.
  • Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.


Not Specified
Data Science Sr Analyst (Hybrid)
✦ New
Salary not disclosed

*At Securian Financial the internal position title is Data Science Sr Analyst or Data Science Consultant. The title and salary will be determined based on experience and applied skills.*

Summary

As an Operational Support Data Scientist at Securian Financial, you will bridge advanced analytics and day-to-day business operations by designing, deploying, monitoring, and continuously improving AI-driven solutions that support enterprise processes.

This role focuses on supporting reliable, scalable, and explainable AI solutions that enhance operational efficiency, decision support, customer experience, and risk management across Digital, Marketing, Sales, and Servicing functions.

You will operate at the intersection of data science, MLOps, and the business - ensuring models are maintained, enhanced, monitored, and aligned with Securian's Enterprise Data Strategy Vision and Operating Principles.

Responsibilities include but are not limited to:

AI Solution Development & Deployment

  • Work with business teams to enhance and optimize existing AI/ML solutions.

  • Deploy and manage solutions using cloud-native tools (e.g., AWS SageMaker).

Operational Model Support & Optimization

  • Monitor model performance, data drift, and operational KPIs.

  • Troubleshoot production issues and continuously enhance and optimize models for performance, stability, and cost efficiency.

  • Establish measurement frameworks to quantify operational impact of deployed solutions.
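The data-drift monitoring described above is often implemented with a population stability index (PSI) comparison between a training baseline and production data. The sketch below is purely illustrative - it is not Securian's actual tooling, and the bin count and drift thresholds are common conventions rather than anything stated in the posting:

```python
import numpy as np

def population_stability_index(expected, actual, bins=10):
    """Compare a feature's production distribution against its
    training baseline; a higher PSI indicates more drift."""
    # Bin edges are derived from the baseline (training) distribution
    edges = np.histogram_bin_edges(expected, bins=bins)
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Floor empty bins at a tiny value to avoid log(0)
    e_pct = np.clip(e_pct, 1e-6, None)
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, 10_000)
drifted = rng.normal(1.0, 1.0, 10_000)  # simulated shifted production data

no_drift = population_stability_index(baseline, baseline)
drift = population_stability_index(baseline, drifted)
```

A common rule of thumb treats PSI below 0.1 as stable and above 0.25 as significant drift worth an alert.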

Data Engineering & Analytical Execution

  • Transform structured, semi-structured, and unstructured data into actionable features and insights.

  • Perform exploratory analysis and visualization to identify operational improvement opportunities.

  • Collaborate with engineering teams to productionize data solutions.

Stakeholder Engagement & Explainability

  • Partner with cross-functional operational stakeholders to understand business workflows and translate them into AI-enabled solutions.

  • Communicate complex AI methodologies and results clearly to technical and non-technical audiences.

  • Ensure model transparency, explainability, fairness, and ethical AI application in alignment with enterprise governance standards.

Required Qualifications

  • Demonstrated experience developing, deploying, or supporting production AI/ML models in cloud environments.

  • Strong proficiency in Python and experience with tools such as AWS SageMaker and GitHub.

  • Experience building operationalized data science solutions (not just prototypes).

  • Strong understanding of statistical modeling, machine learning algorithms, and model validation techniques.

  • Ability to clearly explain technical concepts, model outputs, and operational trade-offs to stakeholders.

  • Strong ethical judgment with a commitment to responsible and unbiased AI development.

Preferred Qualifications

  • 2+ years of hands-on experience in data science, applied AI, or machine learning.

  • Experience supporting AI solutions in operational or production environments.

  • Familiarity with MLOps practices, model governance frameworks, and automation tooling.

  • Experience working in regulated industries (financial services preferred).

#LI-hybrid **This position will be in a hybrid working arrangement.**

Securian Financial believes in hybrid work as an integral part of our culture. Associates get the benefit of working both virtually and in our offices. If you're within a commutable distance (90 minutes) you'll join us 3 days each week in our offices to collaborate and build relationships. Our policy allows flexibility for the reality of business and personal schedules.

The estimated base pay range for this job is:

$72,000.00 - $134,000.00

Pay may vary depending on job-related factors and individual experience, skills, knowledge, etc. More information on base pay and incentive pay (if applicable) can be discussed with a member of the Securian Financial Talent Acquisition team.

Be you. With us. At Securian Financial, we understand that attracting top talent means offering more than just a job - it means providing a rewarding and fulfilling career. As a valued member of our high-performing team, we want you to connect with your work, your relationships and your community. Enjoy our comprehensive range of benefits designed to enhance your professional growth, well-being and work-life balance, including the advantages listed here:

Paid time off:

  • We want you to take time off for what matters most to you. Our PTO program provides flexibility for associates to take meaningful time away from work to relax, recharge and spend time doing what's important to them. And Securian Financial rewards associates for their service by providing additional PTO the longer you stay at Securian.

  • Leave programs: Securian's flexible leave programs allow time off from work for parental leave, caregiver leave for family members, bereavement and military leave.

  • Holidays: Securian provides nine company-paid holidays.

Company-funded pension plan and a 401(k) retirement plan: Share in the success of our company. Securian's 401(k) company contribution is tied to our performance up to 10 percent of eligible earnings, with a target of 5 percent. The amount is based on company results compared to goals related to earnings, sales and service.

Health insurance: From the first day of employment, associates and their eligible family members - including spouses, domestic partners and children - are eligible for medical, dental and vision coverage.

Volunteer time: We know the importance of community. Through company-sponsored events, volunteer paid time off, a dollar-for-dollar matching gift program and more, we encourage you to support organizations important to you.

Associate Resource Groups: Build connections, be yourself and develop meaningful relationships at work through associate-led ARGs. Dedicated groups focus on a variety of interests and affinities, including:

  • Mental Wellness and Disability

  • Pride at Securian Financial

  • Securian Young Professionals Network

  • Securian Multicultural Network

  • Securian Women and Allies Network

  • Servicemember Associate Resource Group

For more information regarding Securian's benefits, please review our Benefits page.

This information is not intended to explain all the provisions of coverage available under these plans. In all cases, the plan document dictates coverage and provisions.

Securian Financial Group, Inc. does not discriminate based on race, color, religion, national origin, sex, gender, gender identity, sexual orientation, age, marital or familial status, pregnancy, disability, genetic information, political affiliation, veteran status, status in regard to public assistance or any other protected status. If you are a job seeker with a disability and require an accommodation to apply for one of our jobs, please contact us by email at , by telephone (voice), or 711 (Relay/TTY).



Remote working/work at home options are available for this role.
Not Specified
SENIOR AWS DATA ENGINEER
✦ New
Salary not disclosed
Irving, Texas 11 hours ago

Visa Status: US Citizen or Green Card Only

Location: Irving, TX (Local Candidates Only)

Employment Type: Full-time / Direct Hire

Work Environment: Hybrid (Monday through Thursday - in office / Friday - at home)

***MUST HAVE 10+ YEARS EXPERIENCE AS A DATA ENGINEER***

***US Citizen or Green Card Only***

The AWS Senior Data Engineer will own the planning, design, and implementation of data structures for this leading Hospitality Corporation in their AWS environment. This role will be responsible for incorporating all internal and external data sources into a robust, scalable, and comprehensive data model within AWS to support business intelligence and analytics needs throughout the company.

Responsibilities:

  • Collaborate with cross-functional teams to understand and define business intelligence needs and translate them into data modeling solutions
  • Develop, build, and maintain scalable data pipelines, data schema designs, and dimensional data models in Databricks and AWS for all system data sources, API integrations, and bespoke data ingestion files from external sources. Includes batch and real-time pipelines.
  • Responsible for data cleansing, standardization, and quality control
  • Create data models that will support comprehensive data insights, business intelligence tools, and other data science initiatives
  • Create data models and ETL procedures with traceability, data lineage and source control
  • Design and implement data integration and data quality framework
  • Implement data monitoring best practices with trigger based alerts for data processing KPIs and anomalies
  • Investigate and remediate data problems, performing and documenting thorough and complete root cause analyses. Make recommendations for mitigation and prevention of future issues.
  • Work with Business and IT to assess efficacy of all legacy data sources, making recommendations for migration, anonymization, archival and/or destruction.
  • Continually seek to optimize performance through database indexing, query optimization, stored procedures, etc.
  • Ensure compliance with data governance and data security requirements, including data life cycle management, purge and traceability.
  • Create and manage documentation and change control mechanisms for all technical design, implementations and systems maintenance.
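The trigger-based alerting on data-processing KPIs mentioned in the responsibilities above can be sketched as a simple threshold check per pipeline run. This is a minimal illustration, not the corporation's actual monitoring stack; the KPI names and limits are hypothetical:

```python
# Minimal sketch of trigger-based alerts on data-processing KPIs.
# KPI names, thresholds, and the alert format are illustrative assumptions.

def check_kpis(kpis, thresholds):
    """Return an alert message for every KPI outside its allowed range."""
    alerts = []
    for name, (low, high) in thresholds.items():
        value = kpis.get(name)
        if value is None:
            alerts.append(f"{name}: metric missing")
        elif not (low <= value <= high):
            alerts.append(f"{name}: {value} outside [{low}, {high}]")
    return alerts

# Example: one pipeline execution's metrics against its limits
run_kpis = {"rows_ingested": 120, "null_rate": 0.07, "runtime_minutes": 42}
limits = {
    "rows_ingested": (1_000, 5_000_000),  # anomalously low row count
    "null_rate": (0.0, 0.05),             # data-quality threshold
    "runtime_minutes": (0, 60),
}
alerts = check_kpis(run_kpis, limits)
```

In practice the returned alerts would feed a notification channel (e.g., an on-call page or dashboard) rather than a plain list, but the threshold-trigger pattern is the same.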

Target Skills and Experience

  • Bachelor's or graduate degree in computer science, information systems or related field preferred, or similar combination of education and experience
  • At least 10 years' experience designing and managing data pipelines, schema modeling, and data processing systems.
  • Experience with Databricks a plus (or similar tools like Microsoft Fabric, Snowflake, etc.) to drive scalable data solutions.
  • Experience with SAP a plus
  • Proficient in Python, with a track record of solving real-world data challenges.
  • Advanced SQL skills, including experience with database design, query optimization, and stored procedures.
  • Experience with Terraform or other infrastructure-as-code tools is a plus.
Not Specified