Doctor of Medicine | Psychiatry - General/Other
Location: San Antonio, TX
Employer:
Pay: Competitive weekly pay (inquire for details)
Start Date: ASAP
About the Position
LocumJobsOnline is working with a client facility to find a qualified Psychiatry MD in San Antonio, Texas, 78236!
This Job at a Glance
- Job Reference Id: ORD-209014-MD-TX
- Title: MD
- Dates Needed: ASAP and Ongoing
- Shift Type: Day Shift
- Assignment Type: Inpatient
- Call Required: No
- Board Certification Required: Yes
- Job Duration: Locums
The facility operates as a community mental health center featuring a specialized crisis stabilization unit. The center focuses on providing acute psychiatric care for individuals experiencing mental health emergencies in a structured inpatient setting.
About the Facility Location
This Texas region offers diverse regional attractions including outdoor recreation opportunities and year-round activities. The area provides access to coastal destinations with beach activities and recreational areas, offering birding opportunities and regular community events including farmers markets and cultural experiences.
About the Clinician's Workday
The psychiatrist will provide comprehensive psychiatric care on an extended observation unit treating adult patients experiencing mental health crises. The clinician will work Monday through Friday from 8:00 AM to 5:00 PM, serving as the primary psychiatric provider on the unit with responsibility for crisis stabilization interventions. The position involves managing a census of 8-16 patients with an average length of stay of 24 hours, requiring coordination with nursing staff and conducting patient handoffs with a nurse practitioner.
Additional Job Details
- Case Load/PPD: 8-16
- Support Staff: Nurse practitioner and nursing staff
- Patient Population: All Ages
- Location Type: On-Site
- Government: No
- Shift Hours: Full time (40 hours)
- Cases Treated: Crisis Stabilization
- Average Length of Stay: 24 Hours
- Census: 8-16
Why choose us?
Our services are 100% free for clinicians and are designed for a seamless experience with every assignment:
- Precision job matching with proprietary algorithm
- Rapid credentialing with Axuall Digital Wallet
- Concierge support with a dedicated clinician deployment specialist
- Digital hub for assignment details
Contact:
About
The need has never been greater to connect great clinicians and great healthcare facilities. That’s what we do. Every day. We connect clients and clinicians to take care of patients. How do we do it? By doing it better than everyone else. Whether you’re looking for a locum tenens job or locum tenens coverage, our experienced agents have the specialized knowledge, know-how, and personal relationships to take care of you and your search.
We provide comprehensive onboarding and optional 1099 financial consulting from a partner advisor.
We cover your malpractice insurance (A++) and provide assistance with credentialing, privileging, licensing, housing and travel.
Our agents have the specialized knowledge and personal connections to provide the best locum tenens experience and negotiate top pay on your behalf.
Your role and responsibilities
About the Opportunity
IBM Consulting is seeking an accomplished Data & Analytics Associate Partner to accelerate our growth within the Industrial & Communications sectors. This executive role is responsible for shaping client vision, cultivating senior executive relationships, and developing data-driven solutions that enable clients to successfully navigate complex transformation programs.
You will bring together deep industry expertise and IBM’s portfolio of data, analytics, and AI capabilities to help organizations modernize their data ecosystems—migrating from legacy platforms to modern hybrid cloud architectures—while adopting next-generation analytics, GenAI, and agentic AI to strengthen decision-making and deliver measurable business and financial outcomes.
This role is ideal for a seasoned leader who integrates industry depth, consulting excellence, and technical thought leadership, has a strong understanding of competitive market dynamics, and consistently delivers high-impact transformation at scale.
Key Responsibilities
Market Leadership & Growth
Expand IBM’s Data & Analytics presence by identifying new market opportunities, developing differentiated solutions, and building a strong pipeline.
Engage senior client executives to understand strategic priorities and shape data transformation roadmaps aligned to their business and financial goals.
Lead end-to-end sales cycles, including solution definition, proposal leadership, financial structuring, and contract negotiation.
Strategic Advisory & Transformation Delivery
Advise C-suite leaders on strategies for data estate modernization, advanced analytics, GenAI, and agentic AI to drive business performance.
Architect integrated solutions that include:
Migration from legacy data platforms to modern cloud-based architectures
Data engineering and Information governance
Business intelligence and advanced analytics
GenAI-powered and agentic AI-driven automation and decisioning
Lead complex transformation programs from discovery through delivery, ensuring measurable outcomes and client satisfaction.
Engagement Excellence & Financial Stewardship
Oversee multi-disciplinary delivery teams to ensure high-quality, consistent execution across all program phases.
Manage engagement financials, including forecasting, margin performance, and overall portfolio profitability.
Align the right client technologies, industry expertise, and global delivery capabilities to maximize client value.
Practice Building & Talent Development
Recruit, mentor, and grow top-tier consultants, architects, and data specialists.
Build and scale capabilities in data modernization, cloud data engineering, analytics, GenAI, and emerging agentic AI techniques.
Contribute to practice strategy, offering development, and capability growth across the global Data & Analytics team.
Thought Leadership & Market Presence
Stay ahead of sector and technology trends, including cloud modernization, GenAI, agentic system design, regulatory changes, and evolving competitive dynamics.
Represent IBM at industry conferences, client events, webinars, and executive roundtables.
Create original thought leadership—articles, perspectives, point-of-views—that positions IBM as a leading advisor in data and AI-driven transformation.
This position can be performed anywhere in the US.
"Leaders are expected to spend time with their teams and clients and therefore are generally expected to be in the workplace a minimum of three days a week, subject to business needs."
Required technical and professional expertise
Qualifications
12+ years of experience in consulting, data strategy, analytics, or digital transformation, with strong exposure to the Industrial or Communications sectors.
Hands-on experience modernizing data ecosystems, including migrating from legacy on-premise platforms to modern cloud-native or hybrid cloud architectures.
Deep expertise with major cloud platforms and their data/analytics stacks, including implementation experience with:
AWS (e.g., Redshift, S3, Glue, EMR, Athena, Lake Formation, Bedrock, SageMaker)
Microsoft Azure (e.g., Azure Data Lake, Synapse, Data Factory, Databricks on Azure, Fabric, Cognitive Services)
Google Cloud Platform (e.g., BigQuery, Cloud Storage, Dataflow, Dataproc, Vertex AI)
Experience designing and implementing end-to-end data pipelines, governance frameworks, and analytics solutions on one or more of these platforms.
Strong understanding of GenAI architectures, LLM integration patterns, vector databases, retrieval-augmented generation (RAG), and emerging agentic AI frameworks (a minimal retrieval sketch follows this list).
Proven track record of selling, structuring, and delivering large-scale data and AI transformation programs.
Robust technical and functional expertise in data engineering, cloud data platforms, analytics, AI/ML, information management, and governance.
Executive-level communication and presence, with demonstrated ability to influence senior stakeholders and convey complex topics through compelling narratives.
Financial management experience, including engagement economics, forecasting, margin optimization, and portfolio profitability.
Demonstrated leadership in building, scaling, and developing high-performing consulting and technical teams.
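For context on the RAG pattern referenced above, here is a minimal, illustrative Python sketch of the retrieval step: embed a query, rank stored chunks by cosine similarity, and assemble a grounded prompt. The embedding function and corpus are stand-ins, not any specific IBM or cloud product API; a real system would call a hosted embedding model and a vector database.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in embedding. A real system would call a hosted model
    (e.g., on Bedrock or Vertex AI); this returns a pseudo-random
    unit vector so the example runs anywhere."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

# Toy "vector store": pre-embedded document chunks (invented content).
corpus = [
    "Quarterly churn rose 4% in the industrial segment.",
    "The legacy warehouse migration completed in Q3.",
    "Agentic workflows now route tickets to domain experts.",
]
index = [(chunk, embed(chunk)) for chunk in corpus]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank chunks by cosine similarity (vectors are unit-normalized,
    so the dot product is the cosine)."""
    q = embed(query)
    ranked = sorted(index, key=lambda item: float(q @ item[1]), reverse=True)
    return [chunk for chunk, _ in ranked[:k]]

def build_prompt(query: str) -> str:
    """Assemble a retrieval-augmented prompt to ground the LLM."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("What happened to churn?"))
```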
Preferred technical and professional experience
IBM is committed to creating a diverse environment and is proud to be an equal-opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender, gender identity or expression, sexual orientation, national origin, caste, genetics, pregnancy, disability, neurodivergence, age, veteran status, or other characteristics. IBM is also committed to compliance with all fair employment practices regarding citizenship and immigration status.
Visa Status: US Citizen or Green Card Only
Location: Irving, TX (Local Candidates Only)
Employment Type: Full-time / Direct Hire
Work Environment: Hybrid (Monday through Thursday in office / Friday at home)
***MUST HAVE 10+ YEARS EXPERIENCE AS A DATA ENGINEER***
***US Citizen or Green Card Only***
The AWS Senior Data Engineer will own the planning, design, and implementation of data structures for this leading Hospitality Corporation in their AWS environment. This role will be responsible for incorporating all internal and external data sources into a robust, scalable, and comprehensive data model within AWS to support business intelligence and analytics needs throughout the company.
Responsibilities:
- Collaborate with cross-functional teams to understand and define business intelligence needs and translate them into data modeling solutions
- Develop, build, and maintain scalable data pipelines, data schema designs, and dimensional data models in Databricks and AWS for all system data sources, API integrations, and bespoke data ingestion files from external sources, including both batch and real-time pipelines
- Responsible for data cleansing, standardization, and quality control
- Create data models that will support comprehensive data insights, business intelligence tools, and other data science initiatives
- Create data models and ETL procedures with traceability, data lineage and source control
- Design and implement data integration and data quality framework
- Implement data monitoring best practices with trigger-based alerts for data processing KPIs and anomalies (a minimal alerting sketch follows this list)
- Investigate and remediate data problems, performing and documenting thorough and complete root cause analyses. Make recommendations for mitigation and prevention of future issues.
- Work with Business and IT to assess efficacy of all legacy data sources, making recommendations for migration, anonymization, archival and/or destruction.
- Continually seek to optimize performance through database indexing, query optimization, stored procedures, etc.
- Ensure compliance with data governance and data security requirements, including data life cycle management, purge and traceability.
- Create and manage documentation and change control mechanisms for all technical design, implementations and systems maintenance.
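For illustration only, here is a minimal Python sketch of the trigger-based KPI alerting idea from the responsibilities above. The KPI names, thresholds, and alert channel are assumptions, not details from the posting; a production setup would more likely publish metrics to CloudWatch or a Databricks job monitor.

```python
from dataclasses import dataclass

@dataclass
class KpiRule:
    name: str
    lower: float  # alert if the observed value falls below this bound
    upper: float  # alert if the observed value rises above this bound

def check_kpis(observed: dict[str, float], rules: list[KpiRule]) -> list[str]:
    """Return an alert message for every KPI outside its expected band."""
    alerts = []
    for rule in rules:
        value = observed.get(rule.name)
        if value is None:
            alerts.append(f"{rule.name}: no measurement received")
        elif not (rule.lower <= value <= rule.upper):
            alerts.append(f"{rule.name}={value} outside [{rule.lower}, {rule.upper}]")
    return alerts

# Hypothetical nightly-load metrics; names and thresholds are assumptions.
rules = [
    KpiRule("rows_ingested", lower=90_000, upper=110_000),
    KpiRule("null_rate_pct", lower=0.0, upper=2.0),
]
observed = {"rows_ingested": 42_000, "null_rate_pct": 0.4}

for alert in check_kpis(observed, rules):
    print("ALERT:", alert)  # in practice: publish to SNS, PagerDuty, email, etc.
```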
Target Skills and Experience
- Bachelor's or graduate degree in computer science, information systems or related field preferred, or similar combination of education and experience
- At least 10 years' experience designing and managing data pipelines, schema modeling, and data processing systems.
- Experience with Databricks a plus (or similar tools like Microsoft Fabric, Snowflake, etc.) to drive scalable data solutions.
- Experience with SAP a plus
- Proficient in Python, with a track record of solving real-world data challenges.
- Advanced SQL skills, including experience with database design, query optimization, and stored procedures.
- Experience with Terraform or other infrastructure-as-code tools is a plus.
Title: Lead Software Engineer - AI Application Platform
Mode of Interview: 1 round, in person
Location: Must be in Charlotte, NC to work Hybrid Model
Main Skill set: Python, AI and Angular
Description:
Lead Software Engineer - AI Application Platform
The Opportunity
We are seeking a Lead Software Engineer to guide the architectural development and execution of the client, a sophisticated AI-powered application generation platform. This role suits a proven technical leader with deep, hands-on expertise across the full software stack who finds enabling a team to build better software deeply satisfying.
You will shape critical systems, mentor senior and junior developers through complex technical decisions, conduct rigorous code reviews across multiple technology domains, and directly influence the platform's trajectory through strategic engineering leadership.
This is for someone who:
- Engages thoughtfully when a junior developer asks targeted architectural questions—because you see an opportunity to shape how someone thinks about systems
- Takes time to explain subtle type-safety issues in code review, understanding that feedback is a teaching moment
- Can present architecture clearly to executives and confidently explain both what we're building and why it matters
- Finds more energy in the code your team ships than in the code you write individually
- Has proven depth across the full stack and a track record of developing engineers into stronger contributors
This is not a single-language codebase. The role requires the ability to make informed decisions on TypeScript design patterns, Python FastAPI architecture, AWS security posture, and Terraform state management in context with one another.
The Platform Challenge
The client is fundamentally a Platform-as-a-Service (PaaS) for dynamic application generation. This differs from building a traditional SaaS product. Rather than building one application, you're building infrastructure that enables users to build their own applications.
What this means architecturally:
- Dynamic Content Generation at Scale: Unlike traditional development where code is fixed, AppGen generates JSON form schemas, validation rules, and UI layouts on demand. The FormBuilder component doesn't know what fields will exist until runtime. The layout engine renders user-designed screens from configuration, not hardcoded templates.
- Multi-Tenant Isolation & Data Segregation: Each user gets their own generated app, potentially deployed to their own AWS environment. The architecture must account for data isolation, namespace management, and cross-tenant security considerations.
- User-Defined Data Structures: Traditional applications are built with predetermined database schemas. AppGen works differently: form structures, field types, and validation rules emerge from user conversations with Claude. This brings engineering challenges (sketched in code after this list): How do you safely execute validation logic that users define? When users modify existing forms that have thousands of submissions, how do you maintain backward compatibility? How do you version schemas?
- Content Rendering, Not Code Generation: Unlike traditional no-code platforms where users drag-and-drop to build, AppGen uses AI instead. Users chat with Claude, Claude generates a form schema, and your platform renders that schema reliably across diverse field types, validation patterns, and workflows. The system renders configurations for immediate use, rather than generating code for later deployment.
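To make the user-defined data structures point concrete, here is a small, hypothetical Python sketch (not the client's actual implementation) of validating a submission against a form schema that exists only at runtime, with a version registry as one possible answer to the backward-compatibility question. It uses the open-source jsonschema library, which fits the Python side of the stack described here.

```python
import jsonschema  # open-source library: pip install jsonschema

# A schema like this would be generated from the user's conversation
# with the AI; its shape is unknown until runtime (hypothetical example).
form_schema_v2 = {
    "type": "object",
    "properties": {
        "email": {"type": "string"},
        "age": {"type": "integer", "minimum": 0},
    },
    "required": ["email"],
}

# Versioned registry: old submissions stay checkable against the schema
# they were created under, one possible answer to backward compatibility.
schema_registry = {
    1: {"type": "object", "required": ["email"]},
    2: form_schema_v2,
}

def validate_submission(payload: dict, schema_version: int) -> list[str]:
    """Validate a runtime payload; return human-readable error strings."""
    validator = jsonschema.Draft202012Validator(schema_registry[schema_version])
    return [
        f"{'/'.join(map(str, error.path)) or '<root>'}: {error.message}"
        for error in validator.iter_errors(payload)
    ]

print(validate_submission({"email": "a@b.com", "age": -1}, schema_version=2))
# -> ['age: -1 is less than the minimum of 0']
```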
Experience that directly transfers:
- You've contributed to or led development of low-code/no-code platforms (visual builders, workflow engines, configuration-driven systems)
- You've worked on SaaS platforms with multi-tenant architecture and understand isolation strategies, rate limiting, and per-customer customization
- You've built dynamic rendering systems that handle unknown/arbitrary schemas at runtime
- You've addressed the unique challenges of treating data configurations as user-created content (form builders, report designers, automation workflows)
- You understand the difference between platform infrastructure and applications built on that infrastructure—and the architectural implications of each
Core Responsibilities
1. Technical Architecture & Systems Thinking (40%)
- Shape architectural decisions across the full stack: How should the component layer handle dynamically generated forms? What's the right approach to validate complex cross-field dependencies in the FormBuilder? What separation of concerns makes sense between the Generator Lambda and the Parent Backend?
- Guide architecture discussions: Help senior developers think through design trade-offs. Should we use NgRx or Angular signals for this feature? When does a new Lambda function become worthwhile given cold-start costs?
- Identify and address system-wide bottlenecks: Work across layers to improve performance. Explore Lambda cold-start optimization, RDS query efficiency, and DynamoDB access patterns.
- Establish patterns and guide consistency: Define coding conventions that work across Python, TypeScript, and Terraform. Help new team members understand the reasoning behind architectural choices.
- What this looks like in practice: You're able to justify architectural decisions with technical reasoning. When someone questions an approach, you can explain the trade-offs you considered. You can write code in multiple languages to validate an approach if needed.
2. Code Review & Technical Guidance (30%)
- Full-stack PR reviews: Review Python FastAPI endpoints and Angular components with equal depth, understanding how they interact.
- Deep technical review: Catch issues thoughtful code review can surface:
- RxJS Observable lifecycles and potential memory issues in Angular
- Query efficiency and data loading patterns in SQLAlchemy
- Terraform module organization and state management implications
- Type safety and TypeScript coverage gaps
- AWS security and IAM configurations
- Educational feedback: Your code reviews help the team learn. When you identify an issue, reviewees understand not just what changed, but how to think about similar problems in the future.
- Define quality expectations: Work with the team to establish what "production-ready" means for this platform and support consistent application of those standards.
- What this requires: Experience reviewing code across teams and multiple languages. You know how to write feedback that resonates—clear, constructive, and focused on helping people improve.
3. Mentorship & Team Development (20%)
- Expand specialist capabilities: Help backend specialists learn to contribute to the forms-engine. Support frontend experts in understanding FastAPI patterns.
- Accelerate junior developers: Pair on complex problems. Explain the reasoning behind patterns like DataState. Connect architectural choices to implementation details and performance implications.
- Identify and address gaps: Recognize when someone is struggling with a technology and provide targeted support—training, pair programming, or guidance through architectural decisions.
- Create growth opportunities: Stretch the team into new areas. A backend engineer working on their first Terraform contribution. A frontend specialist implementing an AWS Lambda authorizer.
- What this requires: Genuine investment in people's growth. You've walked developers through major transitions (generalist to specialist, specialist to full-stack, or into new technology areas). You understand that team strength grows when individuals expand their capabilities.
4. Stakeholder Communication & Technical Leadership (10%)
- Explain to diverse audiences: Translate architectural choices and trade-offs for product managers, executives, and business stakeholders. Connect "optimizing DynamoDB queries" to "improving form submission latency by 30%."
- Shape technical direction: Contribute the engineering perspective on feasibility, risk, and what unlocks future capabilities.
- Support release confidence: You understand the code changes, comprehend the risks, and know what to monitor. You can stand behind releases.
Required Qualifications
Technical Skills
Frontend (Production Experience)
- 5+ years of Angular (including handling version migrations, optimizing change detection, and guiding teams through reactive patterns)
- Strong TypeScript skills with generics, discriminated unions, and strict mode
- RxJS depth: You understand hot vs. cold observables, unsubscription patterns, and can identify potential memory issues in reviews
- NgRx state management: You've designed stores at scale, optimized selectors, and evaluated architectural implications
- CSS Grid & Responsive Design: You can assess component hierarchy and layout decisions
- Material Design: You've worked within it and know when and how to extend it
Backend (Production Experience)
- 5+ years of Python (async/await, type hints, data modeling)
- FastAPI production experience: session management, dependency injection, middleware
- SQL and ORMs (SQLAlchemy): You write efficient queries and review them critically
- AWS services: Understanding of Lambda behavior, IAM least-privilege patterns, VPC networking
- REST API design: Versioning, error handling, idempotency
- Testing frameworks: pytest, testing st
Job Summary:
Our client is seeking a Data Steward to join their team! This position is located Hybrid in Creve Coeur, Missouri.
Duties:
- Understand business capability needs and processes as they relate to IT solutions through partnering with Product Managers and business and functional IT stakeholders
- Participate in data scraping, data curation and data compilation efforts
- Ensure high quality of the data to end users
- Ensure high quality of the inhouse data via data stewardship
- Implement and utilize data solutions for data analysis and profiling using a variety of tools such as SQL, Postman, R, or Python, following the team’s established processes and methodologies (a small profiling sketch follows this list)
- Collaborate with other data stewards and engineers within the team and across teams on aligning delivery dates and integration efforts
- Define data quality rules and implement automated monitoring, reporting, and remediation solutions
- Coordinate intake and resolution of data support tickets
- Support data migration from legacy systems, data inserts and updates not supported by applications
- Partner with the Data Governance organization to ensure data is secured and access is being managed appropriately
- Identify gaps within existing processes and create new documentation templates to improve existing processes and procedures
- Create mapping documents and templates to improve existing manual processes
- Perform data discoveries to understand data formats, source systems, etc. and engage with business partners in this discovery process
- Help answer questions from the end-users and coordinate with technical resources as needed
- Build prototype SQL queries and continuously engage with end consumers on enhancements
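As a concrete illustration of the profiling duty above, here is a minimal pandas sketch a data steward might run as a first pass; the table, columns, and quality rule are invented for the example.

```python
import pandas as pd

# Toy extract standing in for a table pulled via SQL or an API
# (all names and values invented for illustration).
df = pd.DataFrame({
    "material_id": ["M-1", "M-2", "M-2", None],
    "plant": ["STL", "STL", "KC", "KC"],
    "unit_cost": [10.5, None, 7.25, 7.25],
})

def profile(frame: pd.DataFrame) -> pd.DataFrame:
    """Per-column completeness and cardinality: the first-pass profile
    a steward might review before writing formal quality rules."""
    return pd.DataFrame({
        "dtype": frame.dtypes.astype(str),
        "null_count": frame.isna().sum(),
        "null_pct": (frame.isna().mean() * 100).round(1),
        "distinct": frame.nunique(dropna=True),
    })

print(profile(df))

# Example quality rule: material_id must be present and unique.
dupes = int(df["material_id"].dropna().duplicated().sum())
missing = int(df["material_id"].isna().sum())
print(f"material_id rule violations: {dupes + missing}")
```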
Desired Skills/Experience:
- Bachelor's Degree in Computer Science, Engineering, Science, or other related field
- Applied experience with modern engineering technologies and data principles, for instance big data cloud compute, NoSQL, etc.
- Applied experience with querying SQL and/or NoSQL databases
- Experience in designing data catalogs, including data design, metadata structures, object relations, catalog population, etc.
- Data Warehousing experience
- Strong written and verbal communication skills
- Comfortable balancing demands across multiple projects / initiatives
- Ability to identify gaps in requirements based on business subject matter domain expertise
- Ability to deliver detailed technical documentation
- Expert level experience in relevant business domain
- Experience managing data within SAP
- Experience managing data using APIs
- BigQuery experience
Benefits:
- Medical, Dental, & Vision Insurance Plans
- Employee-Owned Profit Sharing (ESOP)
- 401K offered
The approximate pay range for this position is $104,000 - $115,000+. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.
At KellyMitchell, our culture is world class. We’re movers and shakers! We don’t mind a bit of friendly competition, and we reward hard work with unlimited potential for growth. This is an exciting opportunity to join a company known for innovative solutions and unsurpassed customer service. We're passionate about helping companies solve their biggest IT staffing & project solutions challenges. As an employee-owned, women-led organization serving Fortune 500 companies nationwide, we deliver expert service at a moment's notice.
By applying for this job, you agree to receive calls, AI-generated calls, text messages, or emails from KellyMitchell and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy at
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with focus on automating data processes and driving efficiency within the organization. This role requires a close collaboration with application developers, data engineers, data analysts, data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure the accuracy and consistency.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicate and promote the code of ethics and business conduct.
- Ensure completion of required company compliance training programs.
- Be trained, through formal education or experience, in software/hardware technologies and development methodologies.
- Stay current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow (a minimal DAG sketch follows this list)
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models on a scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- The ability to handle stressful situations.
- Highly developed business acuity and acumen.
- Strong critical thinking and decision-making skills.
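To illustrate the orchestration experience called for above, here is a minimal Airflow 2.x DAG sketch showing extract, validate, and load tasks with explicit dependencies; the DAG name, schedule, and task bodies are placeholders, not Wakefern specifics.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull rows from the source API or database (placeholder)")

def validate(**_):
    print("apply data quality rules before loading (placeholder)")

def load(**_):
    print("write curated rows to the warehouse, e.g. BigQuery (placeholder)")

with DAG(
    dag_id="nightly_sales_ingest",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",            # run daily at 03:00
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Explicit dependencies: a failed task halts everything downstream.
    t_extract >> t_validate >> t_load
```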
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements
Job Title: Health Data Services Strategy and Data Architect Manager
Location: San Francisco Bay Area
Work Mode: Hybrid Model – Onsite as Needed, at least 3-6 days a month
Duration: 3 months
Only Local candidates
Qualification:
• Healthcare analytics principles and performance measurement frameworks.
• Electronic Health Record systems and healthcare data structures.
• Data warehousing concepts and EPIC HB, PB, and Retail Pharmacy architecture necessary for internal and external reporting.
• Healthcare quality metrics and regulatory reporting requirements.
• Data governance principles and healthcare privacy regulations.
• Supervisory principles and budget preparation.
Knowledge, Skills, & Abilities
• Organize and evaluate healthcare analytics programs.
• Supervise and mentor professional and technical staff.
• Translate complex data into actionable insights.
• Balance competing priorities while maintaining alignment with organizational strategy.
• Facilitate collaborative decision-making and governance processes.
• Interpret and apply healthcare regulations within system configuration and documentation standards.
• Communicate effectively with executive and operational stakeholders.
• Prepare reports, policy recommendations, and budget justifications.
• Identify risks, propose solutions, and drive resolution of system or operational issues.
• Build and maintain effective working relationships across diverse service lines and departments.
Education:
• Any combination of education and experience that would likely provide the required knowledge, skills, and abilities is qualifying.
• A Bachelor's degree in public health, healthcare administration, data science, statistics, information systems, business administration, or a related field; AND seven years of progressively responsible experience in healthcare analytics, healthcare IT, digital health, or performance reporting; INCLUDING three years of supervisory or leadership experience overseeing analysts or technical staff.
Duration: 6+ months
Location: 100% Remote
Job Overview
The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains - leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. This role will guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.
Key Responsibilities
- Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
- Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
- Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
- Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
- Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
- Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
- Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
- Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
- Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
- Support ongoing adoption initiatives, including community sessions, office hours, and cross?domain knowledge sharing.
Required Skills & Qualifications
- 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
- Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations and metadata-driven architectures.
- Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
- Excellent communication and presentation skills, with experience translating technical concepts for business partners.
- Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
- Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
- Strong collaboration and stakeholder management skills across engineering, product, and business teams.
- Comfortable working in fast-moving environments and driving clarity through ambiguity.
Preferred Qualifications
- Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
- Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
- Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows (a small ingestion sketch follows this list).
- Background in solution architecture, customer engineering, or sales engineering.
- Experience developing demo environments, sample apps, or repeatable platform enablement assets.
- Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.
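As a flavor of the light proof-of-concept work described above, here is a small, hypothetical Python sketch that pages through a REST endpoint and lands records for a demo; the URL and response shape are invented for illustration.

```python
import json
import requests  # pip install requests

BASE_URL = "https://api.example.com/v1/products"  # hypothetical endpoint

def fetch_all(page_size: int = 100) -> list[dict]:
    """Walk a paginated REST API until it stops returning rows."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL, params={"page": page, "size": page_size}, timeout=30
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])  # assumed response shape
        if not batch:
            return records
        records.extend(batch)
        page += 1

if __name__ == "__main__":
    rows = fetch_all()
    with open("products.jsonl", "w") as f:  # land raw records for the demo
        for row in rows:
            f.write(json.dumps(row) + "\n")
    print(f"ingested {len(rows)} records")
```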
Job Title: Distribution and Marketing Data Product Manager
Division: Beazley Shared Services - Data Management
Location: Multiple Locations, US
Hybrid Role
Reports To: Head of Data Products
Key Relationships: Chief Data Office, Data Leadership Team, Data Owners, Distribution and Marketing, CRM, Data Governance and Quality, Data Stewards, Data Architects, Delivery Team members, Technology Team, Finance, Underwriting, Operations and other Business Stakeholders
Beazley:
Beazley is a global specialist insurance company with over 30 years' experience helping people, communities, and businesses to manage risk all around the world. Our products are wide ranging from cyber & tech to marine, healthcare, financial institutions, and contingency, covering risks like the weather, film production or protection from deadly weapons.
We are a flexible and innovative employer offering a friendly, collaborative, diverse and inclusive work environment. We encourage applications from all backgrounds. Collaboration in office spaces is important and we use a hybrid approach with a minimum of 2 days in the office per week.
We have a wonderful mix of cultures, experiences, and backgrounds at Beazley with over 1500 of us working around the world. Employee's diversity, experience and passion allow us to keep innovating and moving forward, delivering the best. We hire people with wide perspectives, and we have set bold diversity targets as we work towards excellence.
Data @ Beazley:
Our Data team supports Beazley's vision by...
* Being bold through pioneering & championing an exciting vision of how people interact with data
* Facilitating innovation by leading the pace of change in data & analytics, and facilitating the latest capabilities and innovative technologies
* Doing the right thing by providing a controlled working data environment that allows all business domains to thrive independently
* Being the single source of truth for enterprise-wide reporting metrics and KPIs
Our Data team is located at multiple offices across UK, Europe and the US. The specified home office location options provide the best balance for being co-located with key Data Office colleagues and business stakeholders.
The Role:
Data is one of Beazley's greatest assets, and this role is central to supporting our Distribution and Marketing insights, which include Customer, Broker and Marketing data. We're seeking a strategic and technically savvy Data Product Manager to lead the strategy, development and evolution of data products and insights that empower our distribution and marketing teams. This role is critical to aligning our data, unlocking insights, and informing growth opportunities across our specialty portfolio. In this role, you will also work to mature data literacy and capabilities as Beazley undertakes a significant investment in modernization, enabling you to embed a culture of data excellence and innovation in our delivery.
Key Responsibilities:
Partner with the global Distribution and Marketing team to understand, prioritize and develop data products and insights that support their business strategy.
Build and own a roadmap to provide regular updates on delivery commitments for data products, insights, enhancements and queries.
Manage stakeholder relationships to support the growth strategy for Beazley customers, brokers, teams and products.
Produce insights and key data trends that highlight business performance, RoI, efficiencies and game-changing growth opportunities.
Inspire the adoption and use of insights to drive decisions in investment and operations that improve efficiency and drive growth by leading demonstrations and hands on training sessions.
Lead a team of Product Owners, Product Analysts, Business Analysts and a development team to deliver and maintain data products and insights; maintaining a backlog of work within Jira.
Represent the business in data governance discussions, escalating issues as appropriate.
Ensure that data product development considers policy, methodology and standards, and ensure these are adhered to during product development.
Evaluate the performance of your data product portfolio against KPIs defined by the business and provide feedback on the value delivered.
Proactively anticipate business needs and look for opportunities to bring innovation or new approaches into the user design, experience, product development and insights.
Relentlessly focus on the Distribution and Marketing team as a customer, delivering high quality data and insights that are clear and inspire action.
Partner with the Data Governance Group and CRM solution team (Customer Relationship Management) to drive improvements in our Customer and Broker data quality through MDM and other tools.
Provide leadership, direction, development and support to direct reports (including off-shore resources).
Essential Criteria:
Bachelor's degree in Business, Marketing, Data Science, Computer Science, Economics, Statistics or related field; Master's degree preferred
Proven experience in data product management, marketing analytics or distribution strategy, preferably in insurance or financial services
Experience working with data, building data models, and sharing insights
Skills and Abilities:
Strategic and curious with the ability to design and develop data and insights that support our Distribution and Marketing team's goals, planning, performance and incentives that drive growth
Understand the specialty insurance market, customer segmentation and distribution channels, with experience in North America, Lloyd's, Retail and Wholesale markets preferred
Ability to lead workshops that help your stakeholders identify data needs and articulate their desired user experience, with the ability to build dashboards preferred
Strong organization and communication skills with the ability to direct work, document requirements and present demos
Advanced technical skills with the ability to dive into the data, identify anomalies, and provide high quality, trusted data
Understanding of Specialty Insurance principles and key drivers to create opportunities, loyalty and growth
Knowledge and Experience:
Experience in Data Products, Data Analytics, Data Science, Statistics, Economics or related fields in Insurance, Financial or sales organizations preferred
Strong understanding of MDM and CRM systems and their use with Customer and Broker data
Proficiency in data visualization (Power BI), analytics platform (Snowflake), dashboard design and data storytelling
Experience working with insurance data, and in particular a strong understanding of pipeline intelligence for sales growth/ targeting and performance
Ability to use predictive modeling to drive an understanding of performance, customer behavior, and prospective renewals/growth to help the Distribution Sales team focus on the best opportunities (a toy model sketch follows this list)
Experience managing relationships and teams of stakeholders, business analysts, data analysts, data architects, data modelers, data engineers and testers using agile processes
Skills in data engineering technologies like Kafka, Snowflake / Snowpark, Databricks, Jira and Agile principles
Experience in managing and manipulating large internal and external datasets
Knowledge of relational and dimensional database structures, theories, principles, and practices
Driven and proven team player with ability to work with all levels in a highly intellectual, collaborative, and fast paced environment
Excellent communication skills, with the ability to tailor them appropriately for different audiences, technical backgrounds, and seniority
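To illustrate the renewal-propensity idea above, here is a toy scikit-learn sketch trained on synthetic features; every column, coefficient, and number is invented, and a real model would be built on governed broker and policy data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Synthetic broker/policy features (all invented for illustration).
X = np.column_stack([
    rng.integers(1, 15, n),   # years_with_broker
    rng.uniform(0, 1, n),     # claims_ratio
    rng.integers(0, 2, n),    # multi_line flag
])
# Toy ground truth: longer tenure and fewer claims favor renewal.
p = 1 / (1 + np.exp(-(0.3 * X[:, 0] - 3.0 * X[:, 1] + 0.5 * X[:, 2] - 1.0)))
y = rng.binomial(1, p)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print("holdout accuracy:", round(model.score(X_test, y_test), 2))

# Rank prospects so the sales team works the likeliest renewals first.
scores = model.predict_proba(X_test)[:, 1]
print("top renewal propensities:", np.sort(scores)[-3:].round(2))
```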
Who We Are:
Beazley is a specialist insurance company with over 30 years' experience helping people, communities and businesses to manage risk all around the world. Our mission is to inspire our clients and people with the confidence and freedom to explore, create and build - to enable businesses to thrive. Our clients want to live and work freely and fully, knowing they are benefitting from the most advanced thinking in the insurance market. Our goal is to become the highest performing sustainable specialist insurer.
Our products are wide ranging, from cyber & tech insurance to marine, healthcare, financial institutions and contingency; covering risks such as the weather, film production or protection from deadly weapons.
Our Culture
We have a wonderful mix of cultures, experiences, and backgrounds at Beazley with over 2,000 of us working around the world. Employee's diversity, experience and passion allow us to keep innovating and moving forward, delivering the best. We are proud of our family-feel culture at Beazley that empowers our staff to work from when and where they want, in an adult environment that is big on collaboration, diversity of thought and personal accountability. Our three core values inspire the way we work and how we treat our people and customers.
- Be bold
- Strive for better
- Do the right thing
Upholding these values every day has enabled us to become an innovative and responsive organization in touch with the changing world around us - our ambitious inclusion & diversity and sustainability targets are testament to this.
We are a flexible and innovative employer offering a friendly, collaborative, and inclusive working environment. We actively encourage and expect applications from all backgrounds. Our commitment to fostering a supportive and dynamic workplace ensures that every employee can thrive and contribute to our collective success.
Explore a variety of networks to assist with professional and/or personal development. Our Employee Networks include:
- Beazley RACE - Including, understanding and celebrating People of Colour
- Beazley SHE - Successful, High potential, Empowered women in insurance
- Beazley Proud - Our global LGBTQ+ community
- Beazley Wellbeing - Supporting employees with their mental wellbeing
- Beazley Families - Supporting families and parents-to-be
We encourage internal career progression at Beazley, giving you all the tools you need to drive your own career here, such as:
- Internal Pathways (helping you grow into an underwriting role)
- iLearn (our own learning & development platform)
- LinkedIn Learning
- Mentorship program
- External qualification sponsorship
- Continuing education and tuition reimbursement
- Secondment assignments
The Rewards
- The opportunity to connect and build long-lasting professional relationships while advancing your career with a growing, dynamic organization
- Attractive base compensation and discretionary performance related bonus
- Competitively priced medical, dental and vision insurance
- Company paid life, and short- and long-term disability insurance
- 401(k) plan with 5% company match and immediate vesting
- 22 days PTO (prorated for 1st calendar year of employment), 11 paid holidays per year, with the ability to flex the religious bank holidays to suit your religious beliefs
- Up to $700 reimbursement for home office setup
- Free in-office lunch, travel reimbursement for travel to office, and monthly lifestyle allowance
- Up to 26 weeks of fully paid parental leave
- Up to 2.5 days paid annually for volunteering at a charity of your choice
- Flexible working policy, trusting our employees to do what works best for them and their teams
Salary for this role will be tailored to the successful individual's location and experience. The expected compensation range for this position is $130,000-$150,000 per year plus discretionary annual bonus.
Don't meet all the requirements? At Beazley we're committed to building a diverse, inclusive, and authentic workplace. If you're excited about this role but your experience doesn't perfectly align with every requirement and qualification in the job specification, we encourage you to apply anyway. You might just be the right candidate for this, or one of our other roles.
We are an equal opportunities employer and as such, we will make reasonable adjustments to our selection process for candidates that indicate that, owing to disability, our arrangements might otherwise disadvantage them. If you have a disability, including dyslexia or other non-visible ones, which you believe may affect your performance in selection, please advise us in good time and we'll make reasonable adjustments to our processes for you.
Company Description
Press Ganey is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts and build lifetime loyalty among their customers and employees.
Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that have earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients, you will make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.
Our Mission:
We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.
Our Values:
To put Human Experience at the heart of organizations so every person can be seen and understood.
Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.
Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.
Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.
Dare to innovate: We challenge the status quo with creativity and innovation as our true north.
Better together: We check our egos at the door. We work together, so we win together.
We are seeking an experienced Staff Data Engineer to join our Unified Data Platform team. The ideal candidate will design, develop, and maintain enterprise-scale data infrastructure leveraging Azure and Databricks technologies. This role involves building robust data pipelines, optimizing data workflows, and ensuring data quality and governance across the platform. You will collaborate closely with analytics, data science, and business teams to enable data-driven decision-making.
Duties & Responsibilities:
- Design, build, and optimize data pipelines and workflows in Azure and Databricks, including Data Lake and SQL Database integrations.
- Implement scalable ETL/ELT frameworks using Azure Data Factory, Databricks, and Spark (a PySpark sketch follows this list).
- Optimize data structures and queries for performance, reliability, and cost efficiency.
- Drive data quality and governance initiatives, including metadata management and validation frameworks.
- Collaborate with cross-functional teams to define and implement data models aligned with business and analytical requirements.
- Maintain clear documentation and enforce engineering best practices for reproducibility and maintainability.
- Ensure adherence to security, compliance, and data privacy standards.
- Mentor junior engineers and contribute to establishing engineering best practices.
- Support CI/CD pipeline development for data workflows using GitLab or Azure DevOps.
- Partner with data consumers to publish curated datasets into reporting tools such as Power BI.
- Stay current with advancements in Azure, Databricks, Delta Lake, and data architecture trends.
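For illustration, a compact PySpark sketch of the ELT shape described in the responsibilities above: read raw JSON, standardize types, deduplicate, and write a partitioned Delta table. The paths, table name, and columns are placeholders, and it assumes a Databricks-style runtime with an active SparkSession and Delta support.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied by the Databricks runtime

RAW_PATH = "abfss://raw@account.dfs.core.windows.net/events/"  # placeholder
CURATED_TABLE = "curated.events_daily"                         # placeholder

def run_batch() -> None:
    """Read raw JSON, standardize types, deduplicate, write a Delta table."""
    df = (
        spark.read.json(RAW_PATH)
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("event_date", F.to_date("event_ts"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .dropDuplicates(["event_id"])
    )
    (
        df.write.format("delta")
        .mode("overwrite")
        .option("overwriteSchema", "true")  # Delta option for schema changes
        .partitionBy("event_date")
        .saveAsTable(CURATED_TABLE)
    )

if __name__ == "__main__":
    run_batch()
```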
Technical Skills:
- Advanced proficiency in Azure (5+ years), including Data Lake, ADF, and SQL.
- Strong expertise in Databricks (5+ years), Apache Spark (5+ years), and Delta Lake (5+ years).
- Proficient in SQL (10+ years) and Python (5+ years); familiarity with Scala is a plus.
- Strong understanding of data modeling, data governance, and metadata management.
- Knowledge of source control (Git), CI/CD, and modern DevOps practices.
- Familiarity with the Power BI visualization tool.
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, or related field.
- 7+ yearsof experience in data engineering, with significant hands-on work incloud-based data platforms (Azure).
- Experience building real-time data pipelines and streaming frameworks.
- Strong analytical and problem-solving skills.
- Proven ability to lead projects and mentor engineers.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Master's degree in Computer Science, Engineering, or a related field.
- Exposure to machine learning integration within data engineering pipelines.
Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.
Additional Information for US based jobs:
Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer and is committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.
Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.
The expected base salary for this position ranges from $110,000 to $170,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.
All your information will be kept confidential according to EEO guidelines.
Our privacy policy can be found here: legal-privacy/