Cloudera Data Platform (CDP) Jobs in USA
12,118 positions found — Page 4
AI Data & Python Tools Engineer
We're seeking an AI Data and Python Tools Engineer to develop and deploy intelligent tools that leverage big data infrastructure and modern AI architecture. This role combines strong software engineering fundamentals with the ability to build production-ready AI applications at speed, including integration with Model Context Protocol (MCP) systems.
Responsibilities:
- Develop and deploy AI-powered full-stack applications using Python, React, and modern machine learning frameworks
- Design and streamline data pipelines, train and validate ML models, and implement robust evaluation methods
- Collaborate with cross-functional teams to solve complex problems and integrate scalable, cloud-based AI solutions
- Rapidly prototype, test, and iterate on AI tools with a strong focus on performance, flexibility, and scalability
- Maintain clear technical documentation, perform code reviews, and support the full software development lifecycle
Software Engineering & AI/ML Data, Tools Development
- 3+ years of Python development with a background in back-end services and data processing
- Exposure to AI/ML algorithms
- Familiarity with ML frameworks (TensorFlow, PyTorch, scikit-learn)
- Understanding of LLMs, vector databases, and retrieval systems
- Experience with Model Context Protocol (MCP) integration and server development
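As a rough illustration of the "LLMs, vector databases, and retrieval systems" bullet above: at its core, vector retrieval ranks documents by similarity between embedding vectors. The sketch below uses tiny hand-written vectors in place of real LLM embeddings; the document ids and vectors are invented for the demo, not part of any real system.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus):
    """Return document ids ranked by similarity to the query vector."""
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in corpus.items()]
    return [doc_id for doc_id, _ in sorted(scored, key=lambda t: t[1], reverse=True)]

# Toy "embeddings"; a real system would produce these with an embedding model.
corpus = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(retrieve([1.0, 0.05, 0.0], corpus))  # doc_a ranks first
```

A production retrieval system replaces the dictionary scan with an approximate-nearest-neighbor index in a vector database, but the ranking principle is the same.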
Big Data & Cloud Infrastructure
- Knowledge of building and deploying cloud-based applications
- Hands-on experience with cloud data platforms (AWS/GCP/Azure)
- Proficiency with big data technologies (Spark, Kafka, or similar streaming platforms)
- Experience with data warehouses (Snowflake, BigQuery, Redshift) and data lakes
- Knowledge of containerization (Docker/Kubernetes) and infrastructure as code
Preferred Experience
- Experience building web applications with modern frameworks (React, Vue, or Angular)
- API development and integration experience
- Basic UX/UI design sensibilities for internal tooling
- Experience with real-time data processing and analytics
- Background in building developer tools or internal platforms
- Familiarity with AI/ML operations (MLOps) practices (e.g., experience using Airflow)
- Experience building MCP servers and integrating with AI assistants
- Knowledge of structured data exchange protocols and API design for AI systems.
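To make the MCP and structured-data-exchange bullets concrete: MCP messages are framed as JSON-RPC 2.0. The sketch below is a minimal, hand-rolled JSON-RPC-style dispatcher in that spirit, not the official MCP SDK; the `add` tool and all names are invented for illustration.

```python
import json

# Illustrative only: a JSON-RPC-2.0-style tool dispatcher. Real MCP servers
# use the official SDKs and a richer message schema; this shows the shape of
# structured request/response exchange for AI tool calls.
TOOLS = {
    "add": lambda params: params["a"] + params["b"],
}

def handle_request(raw: str) -> str:
    req = json.loads(raw)
    tool = TOOLS.get(req["method"])
    if tool is None:
        # JSON-RPC's standard "Method not found" error code
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601, "message": "Method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                       "result": tool(req["params"])})

resp = handle_request('{"jsonrpc": "2.0", "id": 1, "method": "add", "params": {"a": 2, "b": 3}}')
print(resp)
```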
Type: Full Time
Location: Austin, TX or Cupertino, CA (Monday–Friday onsite)
*Relocation assistance can be offered based on individual needs and circumstances*
Title: Sr. Manager, Data Governance
Location: Richardson, TX (Hybrid)
Duration: 6 months, with possibility of FTE conversion
JOB SUMMARY
This position incubates and establishes a leading-edge global Data Governance function to support business segments, corporate functions and Digital & Technology stakeholders. Responsibilities include:
- Liaise directly with clients and account teams to provide strategic direction on implementation of data governance programs, best practices, adoption of standards, master data management, and data quality improvement while leveraging leading-edge data governance tools and technology.
- Collaborate with and manage high-performing data governance and data management professionals that support occupier clients and account teams.
- Provide support on data strategy execution in the adoption of data products, including an enterprise data platform that provides game-changing analytics in the CRE industry.
- Serve as the data governance champion of strategic data products and supporting metadata and reference data.
- Implement and support data ownership and stewardship programs for stakeholders across the business to ensure that account teams adopt improved data governance and management practices.
ESSENTIAL DUTIES AND RESPONSIBILITIES
- Participate in the strategy, planning, and execution for Enterprise Data Governance, focusing on the Building Operations & Experience (BOE) business segment. Ensure the company acts with urgency, sensitivity and thought leadership to build competitive capabilities around data.
- Demonstrated leadership experience in a large, complex, global organization, including the ability to effectively work and communicate across organizational lines. Ensure business stakeholder understanding, alignment and commitment to the objectives of the data governance and management program(s). Be the champion and evangelist for data, the business value, and the potential innovations. Be the trusted advisor to senior leadership and peers.
- Demonstrated experience in building relationships and leading high-performing teams with top talent around the world. Build a high-performance environment and execute a people strategy that attracts, retains, develops and motivates the team by fostering an inclusive work environment, communicating vision/values/business strategy and managing succession and development planning for the team.
- Collaborate with partners across business segments/ business lines, regions and accounts to develop consistent data governance capabilities at all levels, influencing decisions relating to policy, practices, supporting technology, and talent development.
- Establish leading data management practices and shared services relating to data quality, data provisioning, metadata, lineage, reference data, issue management and change management.
- Implement data governance as commodity services that could be leveraged by various clients in different industries. Understand clients' appetite and risk culture in day-to-day support activities and decision-making.
- Establish account team data governance programs. Define data domains and implement business oversight via essential data governance organizations and RACI (i.e., a central data governance function, Data Ownership and Stewardship Program, etc.). Establish data standards, policies and controls. Design and implement the framework, including associated processes, necessary to sustain a data control environment. Monitor compliance with data policies and standards.
- Establish account team and cross-account data quality framework necessary to enable data quality reporting, issue identification, remediation and tracking, ultimately ensuring trust and confidence in data across domains.
- Guide client accounts in adopting the strategic data products, including existing account migrations and new account transitions. Manage data to support the business intelligence of the company and its clients, scaling appropriately with business growth.
- Experience in leading and driving leading-edge data innovation initiatives including big data, cloud computing, IoT, data virtualization and federation, etc., is a plus.
- Create and implement strategic approaches, plans, timelines, preparation of business cases to ensure expedited handling of client data protection, and other data compliance and security requirements.
- Develop and implement metrics needed to monitor/ report on data governance and data management progress
- Develop communication approaches and change management strategies; determine presentation focus and emphasis, and prepare board-level presentations.
- Perform other duties as assigned.
SUPERVISORY RESPONSIBILITIES
Manages the planning, organization, and controls for a major functional area or department. Position will be responsible for managing direct reports across the Americas region and working with peers across all regions, requiring flexibility in schedule. May also be responsible for matrix reports. This position makes recommendations on staff recruitment, selection, promotion, advancement, corrective action and termination, and effectively recommends the same for direct reports to next-level management for review and approval. Monitors appropriate staffing levels and reports on utilization and deployment of human resources. Leads and supports staff in areas of staffing, selection, training, development, coaching, mentoring, measuring, appraising, and rewarding performance and retention. Leads by example and models behaviors that are consistent with the company's values.
QUALIFICATIONS
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required.
Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
EDUCATION and EXPERIENCE
Bachelor's degree (BA/BS) from four-year college or university and a minimum of eight years of related experience and/or training, including five years of experience at the management level.
- 5 or more years of progressively responsible management positions in complex organizations required. Demonstrated success with high visibility projects, leaders in technology use and development, change management, budget and business case development and staff development.
- 5 or more years of related experience in related industry; commercial real estate management preferred.
- 7 or more years of data management related experience, such as data analysis, data governance, enterprise information management, data modeling, and data quality management. Analytics experience desired, e.g., data visualization, data analytics, data mining, business intelligence, etc.
- Candidates must have experience working in large organizations with geographically dispersed teams and complex technical environments.
- Experience in dealing with internal and external customers, service providers and vendors. Must be able to manage competing priorities. Needs to be resilient; resolving conflicts quickly to achieve desired business results.
- Bachelor's degree in Business Administration, Information Management, MIS, Business Intelligence and Data Science, Library Science, Computer Science or related fields; advanced degree preferred.
CERTIFICATES and/or LICENSES
None
COMMUNICATION SKILLS
- Ability to comprehend, analyze, and interpret the most complex business documents. Ability to respond effectively to the most sensitive issues. Ability to write reports, manuals, speeches and articles using distinctive style. Ability to make effective and persuasive presentations on complex topics to employees, clients, top management and/or public groups. Ability to motivate and negotiate effectively with key employees, top management, and client groups to take desired action.
- Ability to establish and maintain a high level of customer trust and confidence in the overall information and analytics space
- Excellent oral, written, and presentation communication skills. Strong negotiation and group facilitation skills; ability to move a process forward, while meeting the needs of a variety of clients.
- Excellent collaboration, influence and leadership skills. Ability to work with various levels of peers including analysts, developers and executives regarding complex business and data related issues.
- Relationship management skills that include excellent listening and consultative capability, the ability to influence and negotiate with business and technology partners to drive change, and the ability to take a broad perspective and make key connections
FINANCIAL KNOWLEDGE
- Requires basic knowledge of financial terms and principles.
- Participates in complex financial/business analysis and report reviews prepared by peers or leaders.
- Manages and oversees the department budget.
REASONING ABILITY
- Ability to solve advanced problems and deal with a variety of options in complex situations. Requires expert level analytical and quantitative skills with proven experience in developing strategic solutions for a growing matrix-based environment. Draws upon the analysis of others and makes recommendations that have a direct impact on the company.
- Understanding of global organizational design and the ability to shape and drive large-scale, cross-functional programs around people, technology, processes, and tools.
- Demonstrated ability to balance long-term strategy with quick wins.
- Demonstrated ability for strategic influencing and education of cross-functional stakeholders about the strategic importance and value of data governance
- Excellent managerial skills; collaborative, imaginative, resourceful, reliable, technically savvy.
- Superior analytical and creative problem-solving skills. Demonstrated successes in data analysis, drawing conclusions and improvement. Apply listening and consultative skills to understand business needs; be able to interpret requirements, identify impacts and analyze problems to determine impacts to business processes across the organization.
- Ability to work well under deadlines, ability to work in a multi-tasking production environment to make good judgments about competing priorities.
- Ability to tell a story to explain or sell a concept.
OTHER SKILLS and/or ABILITIES
- Utilizes an entrepreneurial approach and develops innovative solutions.
- Ability to write business cases, process maps, presentation materials and articles using distinctive style.
- Ability to make effective and persuasive presentations on complex topics across various levels of leadership
- Expert level analytical and quantitative skills with proven experience in developing strategic solutions for a growing matrix-based multi-industry sales environment.
- Ability to use strong conceptual and analytical skills to generate insights and recommendations.
- Demonstrated information management and quantitative skills, including working knowledge of IT infrastructure, various technologies/ platforms, and aligned vendor solutions with enterprise strategic priorities.
- Experience managing small to mid-size teams and delivering results.
- Thorough knowledge of cutting-edge data management tools, industry advances, etc.
- Superior project management/ consulting and leadership skills. Demonstrated ability to facilitate complex, mission critical projects and to develop, participate in and guide multi-disciplinary work teams. Manage task timelines and deliverable schedules and share concerns about deliverables, timelines, and issues with Data Governance services or deliverables.
- Superior ability to manage, manipulate and analyze raw data, draw conclusions, and develop actionable recommendations using technology. Articulate the issues and resolutions via business-friendly communications. Serve as primary day-to-day contact for regional data management issues.
- Advanced understanding of data quality management. Knowledge of data governance and how it impacts business processes.
- Knowledge of master data management in a global environment, including data lifecycle and maintenance processes.
- Skills in MS Visio, Word and PowerPoint are a plus.
- Experience with reference data management tools, including Collibra, MS Excel, SQL query, etc., is a plus.
- Software development lifecycle knowledge, with a background in agile philosophies.
Candidates are required to have experience modernizing legacy Microsoft BI environments (including SSIS).
This is not an SSIS-only role.
The consultant will design, modernize, and enhance enterprise data and analytics solutions supporting Cyber Security, Physical Security, Electronic Security and Police operations.
This role includes evolving legacy SQL Server/SSIS-based processes into modern Azure data architectures while designing scalable new ETL/ELT pipelines and delivering executive-level analytics solutions.
The consultant will work directly with stakeholders to deliver production-grade reporting and analytics capabilities across multiple enterprise systems.
This requires architectural thinking and hands-on technical execution.
Core Responsibilities:
Candidates must have direct experience building enterprise-grade ETL pipelines and executive Power BI dashboards.
- Design and implement modern ETL/ELT pipelines in Azure
- Assess and refactor existing SSIS packages as part of broader modernization efforts
- Architect Lakehouse/Medallion data models
- Develop optimized dimensional data models (star schema)
- Integrate data from SQL Server, Oracle, APIs, and security platforms
- Design and deploy enterprise Power BI dashboards
- Build paginated reports using Power BI Report Builder
- Optimize DAX and dataset performance
- Implement Row-Level Security (RLS)
- Support CI/CD and DevOps deployment processes
- Produce technical documentation and data lineage artifacts
- Engage directly with executive stakeholders
Required Technical Skills (Must-Have):
Data Engineering & Architecture:
- Strong ETL/ELT design and optimization experience
- Advanced SQL (expert-level required)
- Python/PySpark
- Dimensional data modeling (star schema required)
- REST API integrations
Azure Data Stack:
- Azure Data Factory
- Azure Databricks
- Azure Synapse Analytics
- Azure Data Lake Storage
Microsoft Data Platform:
- Experience with SQL Server data warehouse environments
- Working knowledge of SSIS and experience modernizing or migrating SSIS workflows to Azure-based solutions
Power BI:
- Power BI Desktop (expert-level)
- Advanced DAX
- Executive dashboard development
- Paginated reports (Power BI Report Builder)
- Data Gateway configuration
- Incremental refresh
- Row-Level Security (RLS)
Nice to Have:
- Microsoft Purview
- Terraform (Infrastructure as Code)
- Orchestration tools (Airflow or equivalent)
- Security systems data integration experience
- Experience with C#/.NET web application development (for integration with internal systems or APIs)
Experience Requirements:
- 7+ years of hands-on data engineering/analytics delivery
- Demonstrated experience building production data pipelines in Azure
- Proven experience delivering executive-facing Power BI solutions
- Experience working in complex enterprise environments
Software Skills:
- 4–6 years of experience in Azure for building, deploying, and managing cloud-based data and application services.
Technical Skills: 2–4 years of experience in .NET code development for developing and maintaining enterprise applications and data processing components.
6+ years of experience in Data Modeling including designing logical and physical data models for enterprise data warehouses and analytics systems.
6+ years of experience in Python scripting for data processing, automation, ETL development, and data transformation tasks.
6+ years of experience in Structured Query Language (SQL) for writing complex queries, stored procedures, performance tuning, and data manipulation.
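The dimensional-modeling (star schema) experience above can be illustrated with a minimal example: one fact table joined to two dimension tables, then rolled up. SQLite is used so the sketch is self-contained; the syntax is near-identical in enterprise warehouses, and every table and column name here is invented for the demo.

```python
import sqlite3

# A minimal star schema: fact_sales at the center, dim_product and dim_date
# as dimensions keyed by surrogate/date keys.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'widgets'), (2, 'gadgets');
INSERT INTO dim_date    VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO fact_sales  VALUES (1, 20240101, 10.0), (1, 20250101, 20.0),
                               (2, 20250101, 5.0);
""")

# Typical analytics rollup: join facts to dimensions, aggregate by attributes.
rows = conn.execute("""
SELECT p.category, d.year, SUM(f.amount)
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
JOIN dim_date d    ON d.date_key    = f.date_key
GROUP BY p.category, d.year
ORDER BY p.category, d.year
""").fetchall()
print(rows)
```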
We are seeking an experienced Data Engineer to design, develop, and maintain scalable data solutions that support business analytics and operational reporting. The ideal candidate will have strong expertise in SQL, data modeling, and cloud-based data platforms, with the ability to build efficient data pipelines and optimize database performance.
The schedule is 4 days in the office in Montvale, NJ and 1 day remote.
Required Qualifications:
Education
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
Technical Skills
- Advanced SQL expertise with 5+ years of experience, including window functions, common table expressions (CTEs), and query optimization.
- Strong knowledge of relational database management systems (RDBMS) and data modeling principles.
- Experience working with cloud-based data platforms, particularly Azure Data Services and modern data warehouse technologies.
- Proficiency in Python for scripting, automation, and data manipulation.
- Experience developing and maintaining ETL processes using tools such as SSIS or Azure Data Factory.
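The advanced-SQL items above (CTEs, window functions) can be demonstrated with Python's built-in sqlite3 module, whose syntax for these constructs matches most RDBMSs; the `sales` table and its data are invented for the demo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("east", 200), ("west", 50), ("west", 75)])

query = """
WITH regional AS (               -- common table expression (CTE)
    SELECT region, amount FROM sales
)
SELECT region,
       amount,
       SUM(amount) OVER (PARTITION BY region) AS region_total  -- window function
FROM regional
ORDER BY region, amount
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

Unlike a GROUP BY, the window function keeps every row while attaching the per-region total alongside it (requires SQLite 3.25+, bundled with modern Python).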
Professional Skills
- Strong analytical and problem-solving abilities.
- Excellent communication skills with the ability to collaborate across technical and non-technical teams.
- Ability to manage multiple priorities in a fast-paced environment.
Preferred Qualifications
- Experience working with NoSQL databases such as Cosmos DB or MongoDB.
- Familiarity with big data frameworks including Apache Spark or Kafka.
- Relevant certifications such as Microsoft Certified: Azure Data Engineer Associate or Google Professional Data Engineer.
Tools & Technologies
- SQL development environments such as DBeaver and SSMS
- Cloud management consoles
- Git for version control
- Jira for project and workflow management
- SSIS and related ETL technologies
Industry
- Leasing
Clinical Data Scientist
Redwood City, CA (Hybrid; potentially remote options)
Salary: $150,000-$190,000
No Sponsorship Available
About the Role
We are seeking a Clinical Data Scientist to play a pivotal role in transforming complex clinical datasets into high‑quality, analysis‑ready outputs used to support clinical trials and real‑world evidence initiatives.
In this role, you will operate at the intersection of data science, clinical research, and statistical programming. You’ll be responsible for validating, cleaning, and structuring data originating from multiple sources—including expert manual abstraction teams, AI‑assisted pipelines, EMR feeds, and EDC systems.
You’ll collaborate closely with Clinical Operations, Data Engineering, and AI/ML teams to ensure accuracy, traceability, and compliance across every dataset delivered internally or externally.
This role is ideal for someone who is detail‑obsessed, technically versatile, and passionate about elevating the quality of clinical data used in drug development.
What You’ll Do
- Convert raw, manually abstracted, and AI‑processed datasets into standardized formats (e.g., CDISC SDTM/ADaM) or client‑specific data models.
- Ensure outputs meet quality, compliance, and traceability standards.
- Generate TLFs (Tables, Listings, Figures) for clinical reports and interim analyses using SAS, R, or Python.
- Perform robust data cleaning and QC checks.
- Investigate anomalies and troubleshoot issues across the data pipeline.
- Distinguish between upstream extraction issues and true clinical variations.
- Partner with Data Platform and AI teams to automate cleaning scripts, validations, and workflow logic.
- Serve as an early user and feedback partner for internal data tools.
- Maintain documentation for data derivations, specifications, and validation logic (e.g., Define.xml, Reviewer's Guides).
- Support compliance and regulatory submission needs.
- Complete internal and external analysis requests to support clinical insights, client value, and platform performance.
- Apply HIPAA-aligned data safeguards and adhere to best practices across privacy, security, and data governance.
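The conversion duty above (raw abstracted data into standardized formats) can be sketched as a simple mapping step. The variable names below follow CDISC DM conventions (STUDYID, USUBJID, SEX, BRTHDTC), but the mapping rules, study id, and sample records are invented for the demo and are far simpler than a real SDTM derivation.

```python
# Raw records as they might arrive from a manual abstraction team.
RAW = [
    {"subject": "001", "sex": "female", "birth_year": 1980},
    {"subject": "002", "sex": "male", "birth_year": 1975},
]

# Controlled-terminology mapping for the SEX variable.
SEX_MAP = {"female": "F", "male": "M"}

def to_dm(raw_rows, study_id="STUDY01"):
    """Derive standardized DM-style rows from raw abstraction output."""
    return [
        {
            "STUDYID": study_id,
            "USUBJID": f"{study_id}-{r['subject']}",       # unique subject id
            "SEX": SEX_MAP.get(r["sex"].lower(), "U"),     # U = unknown
            "BRTHDTC": str(r["birth_year"]),               # ISO-style partial date
        }
        for r in raw_rows
    ]

dm = to_dm(RAW)
print(dm[0]["USUBJID"])  # STUDY01-001
```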
What You Bring
- Education:
- BSc/MSc in Statistics, Mathematics, Computer Science, Life Sciences, or related field.
- Experience:
- 2–5+ years in clinical data science, statistical programming, or data management in pharma/biotech.
- Technical Strengths:
- SAS, R, Python, SQL
- Experience with Git/version control preferred
- Industry Knowledge:
- Familiarity with clinical trial workflows
- Strong understanding of CDISC SDTM/ADaM
- Oncology endpoints (RECIST, survival) and RWD experience is a plus
- Data Wrangling:
- Comfort “stitching together” messy, real‑world clinical datasets
- Experience with unstructured text or NLP outputs is desirable
- Soft Skills:
- Exceptional attention to detail
- Clear, structured communicator
- Proactive, self‑directed, collaborative
DEPLOY has been retained to find a Reporting & Data Architect Lead who combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform—designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.
You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.
At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.
Key Responsibilities
Enterprise Reporting (Hands-On Development)
- Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
- Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
- Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
- Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
- Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (Responsible, with the Director of Technology Accountable).
- Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
- Support reporting scalability as our client grows (new factories, new business units, new product lines).
Enterprise Reporting Standards & Governance
- Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
- Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
- Review and approve reporting change requests, data model modifications, and access requests.
- Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
- Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.
Enterprise Data Architecture
- Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
- Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
- Build and govern the centralized semantic model that powers reporting across the company.
- Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
- Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.
Leadership & Collaboration
- Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
- Communicate complex architectural concepts in clear, business-friendly terms.
- Lead cross-functional initiatives that require unified data structures or scalable reporting.
- Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.
Ideal Candidate Profile
- Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
- Track record of establishing and enforcing enterprise reporting standards and governance.
- Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
- Able to operate as both an individual contributor and a strategic leader.
- Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
- Comfortable influencing architectural decisions and guiding technical execution.
- Strong command of foundational tools and languages such as:
- DAX
- Power Query / M
- SQL
- Fabric pipelines / ETL tooling
- Experience with automation and AI-assisted analytics workflows.
Title: Data QA Engineer
Location: Minneapolis, Dallas, or Atlanta (Onsite)
Job Type: Contract
Experience: 8–15 Years
Key Responsibilities:
- Design, build, and maintain automated data quality frameworks to validate accuracy, completeness, consistency, and timeliness of data.
- Develop automation scripts using Python/SQL to test data pipelines, ETL/ELT processes, and analytics workflows.
- Implement data quality checks and monitoring within Azure-based data platforms.
- Work extensively with Azure services (ADF, ADLS, Synapse) and Databricks for large-scale data processing.
- Integrate data quality validations into CI/CD pipelines and support proactive issue detection.
- Perform root cause analysis for data issues and collaborate with data engineering, analytics, and business teams to resolve them.
- Define and enforce data quality standards, metrics, and SLAs.
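The validation responsibilities above (accuracy, completeness, consistency) reduce to small, composable check functions in practice. The sketch below shows two such checks in plain Python; the rule names and sample records are invented, and a real framework would run these against pipeline outputs and feed results into monitoring.

```python
def check_completeness(records, required_fields):
    """Return rows where any required field is missing or empty."""
    return [r for r in records
            if any(r.get(f) in (None, "") for f in required_fields)]

def check_uniqueness(records, key):
    """Return key values that appear more than once."""
    seen, dupes = set(), set()
    for r in records:
        k = r[key]
        (dupes if k in seen else seen).add(k)
    return sorted(dupes)

records = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": ""},
    {"id": 2, "email": "b@x.com"},
]
print(check_completeness(records, ["id", "email"]))  # the row with the empty email
print(check_uniqueness(records, "id"))               # [2]
```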
Required Skills & Qualifications:
- Strong experience (8–15 years) in data engineering, data quality, or data automation roles.
- Hands-on expertise with Azure data ecosystem and Databricks.
- Strong programming skills in Python and SQL.
- Experience building automated data validation and reconciliation frameworks.
- Solid understanding of data warehousing, data lakes, and distributed data processing.
- Familiarity with DevOps/CI-CD practices for data platforms.
Preferred Skills:
- Experience with data observability or data quality tools.
- Exposure to cloud-scale analytics and performance optimization.
- Strong communication and stakeholder management skills.
Purpose
As a foundational member of our AI Center of Excellence, you will serve as the data science lead for enterprise AI initiatives, architecting and deploying AI solutions that make a meaningful impact across our national retail footprint. The Data Scientist Lead will work with other members of the AI COE and business leadership to identify and execute the highest-impact initiatives and own the data science lifecycle, from hypothesis and feature engineering to model validation and performance monitoring, bridging the gap between cutting-edge AI capabilities and practical business applications.
This role requires a rare blend of deep technical mastery and sharp business acumen, to translate complex data into actionable insights and intelligent systems that enhance customer experience, optimize commercial operations, and enable smarter decision-making at scale. The ideal candidate is passionate about retail innovation, thrives in ambiguity, and is energized by the opportunity to shape AI strategy from the ground up.
You’ll Be Successful With
- Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related quantitative field. Master's degree preferred.
- 5+ years in Data Science or Applied AI roles, preferably in retail or a customer-facing industry.
- Proven track record of moving models from development into production and delivering measurable impact to the business.
- Expert proficiency in Python and SQL. Comfortable with version control.
- Expertise in supervised/unsupervised learning and modern frameworks (e.g. scikit-learn, PyTorch, or TensorFlow).
- Hands-on experience building with LLMs, RAG architectures, prompt engineering, or AI agent development.
- Experience deploying and monitoring models at scale. Familiar with cloud data platforms (e.g. Databricks or Snowflake) and cloud infrastructure (Azure experience a plus).
- You understand how to apply AI to commercial problems such as demand forecasting, customer- and associate-facing applications, personalization, labor optimization, etc.
- The ability to translate "black box" model outputs into clear, actionable insights for business leadership.
- Proven ability to communicate complex technical concepts to non-technical stakeholders and influence decision-making through data-driven storytelling.
- Strong intellectual curiosity with a bias toward action and continuous improvement.
- Demonstrated ability to work autonomously while collaborating effectively within cross-functional teams.
Your Day Consists Of
- Buy-vs.-build leadership: evaluate third-party AI platforms and partnerships, and serve as the technical lead in vetting vendor methodologies, guiding in-house versus external build decisions.
- Partner with business leadership to identify high-value AI opportunities, defining technical specifications and success metrics that align with enterprise strategy.
- Design, develop, and deploy custom machine learning models (impacting merchandising, commercial, labor, digital, etc.) within the Databricks environment.
- Lead experimentation design, including A/B testing and causal inference, to validate model performance and measure true incremental business lift (ROI).
- Collaborate with data engineers on feature development and with AI developers to wrap models into production-grade APIs and applications.
- Partner closely with the Customer Insights team to ensure model outputs are optimized for consumption within Power BI/DAX, turning complex predictions into actionable insights.
- Establish and enforce MLOps standards across the org, including model versioning, automated retraining, and drift monitoring.
- Serve as the primary ML subject matter expert for the broader Data Science & Insights team. Provide active coaching to analytics leaders, empowering them to identify ML-applicable use cases and effectively incorporate predictive outputs into their own functional workstreams.
- Contribute to the enterprise AI governance framework, ensuring ethical AI practices, data privacy compliance, and model transparency.
- Present findings, recommendations, and project updates to leadership and cross-functional partners in clear, compelling formats.
- Be compliant with all appropriate privacy and security protocols.
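The drift monitoring called out in the MLOps standards above can be illustrated with a minimal sketch. One common metric is the Population Stability Index (PSI), which quantifies how far a feature's production distribution has drifted from its training-time baseline. The bin count, epsilon, and alert thresholds below are illustrative conventions, not a prescription for any particular platform.

```python
import math

def psi(baseline, current, bins=10):
    """Population Stability Index between two samples of a numeric feature."""
    lo, hi = min(baseline), max(baseline)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[0] = float("-inf")   # catch production values below the baseline range
    edges[-1] = float("inf")   # and above it

    def frac(sample):
        counts = [0] * bins
        for x in sample:
            for i in range(bins):
                if edges[i] <= x < edges[i + 1]:
                    counts[i] += 1
                    break
        n = len(sample)
        # small floor avoids log(0) for empty bins
        return [max(c / n, 1e-6) for c in counts]

    b, c = frac(baseline), frac(current)
    return sum((ci - bi) * math.log(ci / bi) for bi, ci in zip(b, c))

# Common rule of thumb: PSI < 0.1 stable, 0.1-0.25 moderate drift, > 0.25 retrain
baseline = [0.1 * i for i in range(100)]        # training-time feature values
drifted = [0.1 * i + 3.0 for i in range(100)]   # shifted production values
assert psi(baseline, baseline) < 0.01
assert psi(baseline, drifted) > 0.25
```

In practice a check like this would run on a schedule against feature stores or inference logs, with breaches feeding the automated retraining pipeline the posting describes.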
Working Conditions (travel, hours, environment)
- Limited travel required, including air and car travel
- While performing the duties of this job, the employee is occasionally exposed to a warehouse environment and moving vehicles. The noise level in the work environment is typically quiet to moderate.
Physical/Sensory Requirements
Sedentary Work – Ability to exert 10 - 20 pounds of force occasionally, and/or negligible amount of force frequently to lift, carry, push, pull or otherwise move objects. Sedentary work involves sitting most of the time but may involve walking or standing for brief periods of time.
Benefits & Rewards
- Bonus opportunities at every level
- Non-traditional retail hours (we close at 7p!)
- Career advancement opportunities
- Relocation opportunities across the country
- 401k with discretionary company match
- Employee Stock Purchase Plan
- Referral Bonus Program
- 80 hrs. annualized paid vacation (full-time associates)
- 4 paid holidays per year (full-time hourly store associates only)
- 1 paid personal holiday of associate’s choice and Volunteer Time Off program
- Medical, Dental, Vision, Life and other Insurance Plans (subject to eligibility criteria)
Equal Employment Opportunity
Floor & Decor provides equal employment opportunities to all associates and applicants without regard to age, race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender, gender identity, disability, veteran status, genetic information, ethnicity, citizenship, or any other category protected by law.
This policy applies to all areas of employment, including recruitment, testing, screening, hiring, selection for training, upgrading, transfer, demotion, layoff, discipline, termination, compensation, benefits and all other privileges, terms and conditions of employment. This policy and the law prohibit employment discrimination against any associate or applicant on the basis of any legally protected status outlined above.
$100,000 - $120,000
Location
Hybrid
Summary
Join a growing reinsurance organization as a Junior Data Analyst supporting capital modeling and portfolio analysis. You will play a key role in running complex capital models, preparing and validating data, and presenting insights to underwriting and actuarial teams. This position offers a unique opportunity to develop technical expertise within a lean yet expanding company, contributing to impactful financial risk assessments and reporting.
Requirements
- Bachelor’s degree in Mathematics, Finance, Economics, Data Analytics, Actuarial Science, or related field
- 2–4 years of experience in an analytical role, ideally within insurance, reinsurance, or financial services
- Strong proficiency in Excel, including formulas, modeling, and basic to intermediate VBA
- Experience working with structured datasets and familiarity with data platforms like Palantir Foundry or similar tools
- Ability to validate, troubleshoot, and ensure accuracy of analytical outputs
- Excellent written and verbal communication skills
Responsibilities
- Run new and renewal reinsurance opportunities through existing capital models
- Make targeted adjustments to Excel/VBA models based on guidance from senior team members
- Validate and reconcile model outputs, perform sensitivity reviews, and document results
- Ingest, clean, and prepare data using Palantir or similar platforms to ensure data integrity
- Build clear reports and visualizations to communicate results effectively
- Present findings to underwriting, actuarial, and finance stakeholders, translating technical insights into business relevance
- Support ad-hoc analysis and projects aimed at improving capital efficiency and portfolio performance
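One way to picture the "validate and reconcile model outputs" work described above: compare each model output against a reference figure within a relative tolerance and flag breaks for manual review. The treaty names, field layout, and 0.5% tolerance here are hypothetical, chosen only to make the sketch concrete.

```python
def reconcile(model_outputs, reference, rel_tol=0.005):
    """Flag records whose model output deviates from the reference by more
    than rel_tol (relative), returning a list of breaks for manual review."""
    breaks = []
    for key, modeled in model_outputs.items():
        ref = reference.get(key)
        if ref is None:
            breaks.append((key, modeled, None, "missing reference"))
            continue
        # relative difference, guarding against a zero reference value
        diff = abs(modeled - ref) / abs(ref) if ref else abs(modeled)
        if diff > rel_tol:
            breaks.append((key, modeled, ref, f"rel diff {diff:.2%}"))
    return breaks

model = {"treaty_A": 1_002_000, "treaty_B": 520_000, "treaty_C": 99_000}
ref = {"treaty_A": 1_000_000, "treaty_B": 500_000}
issues = reconcile(model, ref)
# treaty_B deviates by 4% and treaty_C has no reference value
assert [k for k, *_ in issues] == ["treaty_B", "treaty_C"]
```

A real reconciliation would document each break and its resolution, mirroring the "perform sensitivity reviews, and document results" duty in the list.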
This is a very lean but growing reinsurance company, offering opportunities for career growth, skill development, and active contribution to impactful financial projects.
Senior Data Modeler
Hybrid (3–4 days onsite)
Location: Phoenix, Arizona
Salary: $130,000 - $150,000 base
A large, operationally complex organization is undergoing a major modernization of its data platform and is building a new, cloud-native analytics foundation from the ground up. This is a greenfield opportunity for a senior-level data modeler to establish best practices, influence architecture, and help shape how data is organized and used across the business.
This role sits at the center of a multi-year transformation focused on modern analytics, scalable data products, and strong collaboration between data and business teams.
What You’ll Be Working On
- Designing and implementing enterprise data models across conceptual, logical, and physical layers
- Establishing Medallion architecture patterns and reusable modeling assets
- Building dimensional and semantic models that support analytics and reporting
- Partnering closely with domain experts and functional leaders to translate business needs into data structures
- Collaborating with data engineers to align models with ELT pipelines and analytics frameworks
- Helping define modeling standards and upskilling senior engineers in modern data modeling practices
- Contributing hands-on to data engineering work where needed (SQL, transformations, optimization)
- Proactively identifying analytics opportunities and recommending data structures to support them
This role is roughly 40% data modeling, 30% hands-on engineering, and 30% cross-functional collaboration.
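The dimensional modeling described above would normally be expressed in SQL/dbt, but the core idea — splitting a flat feed into a deduplicated dimension with surrogate keys plus a slim fact table that references it — can be sketched in a few lines of Python. The store/sales column names are hypothetical.

```python
def to_star_schema(flat_rows, dim_cols, fact_cols):
    """Split flat records into a dimension table (deduplicated, with
    surrogate keys) and a fact table that references it by key."""
    dim, fact = {}, []
    for row in flat_rows:
        dim_key = tuple(row[c] for c in dim_cols)
        # assign a surrogate key the first time this dimension row appears
        sk = dim.setdefault(dim_key, len(dim) + 1)
        fact.append({"dim_sk": sk, **{c: row[c] for c in fact_cols}})
    dim_table = [dict(zip(dim_cols, k), dim_sk=sk) for k, sk in dim.items()]
    return dim_table, fact

rows = [
    {"store": "PHX-01", "region": "West", "date": "2024-01-01", "sales": 1200},
    {"store": "PHX-01", "region": "West", "date": "2024-01-02", "sales": 950},
    {"store": "TUC-02", "region": "West", "date": "2024-01-01", "sales": 800},
]
dim_store, fact_sales = to_star_schema(rows, ["store", "region"], ["date", "sales"])
assert len(dim_store) == 2           # store dimension deduplicated
assert fact_sales[1]["dim_sk"] == 1  # both PHX-01 facts share one surrogate key
```

In a Medallion layout, the flat rows would live in a silver layer and the resulting dimension and fact tables in gold, materialized via dbt models rather than Python.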
Must-Have Experience
- Strong, hands-on experience with data modeling (dimensional, canonical, semantic)
- Deep understanding of Medallion architecture
- Advanced SQL and experience working with a modern cloud data warehouse
- Experience with dbt for transformations and modeling
- Hands-on experience in cloud-native data environments (AWS preferred)
- Ability to work directly with business stakeholders and explain technical concepts clearly
- Experience collaborating closely with data engineers on execution
Nice to Have
- Python experience
- Familiarity with Informatica or reverse-engineering legacy data models
- Exposure to streaming or near-real-time data pipelines
- Experience with visualization tools (tool choice is flexible)
Who Will Thrive in This Role
- A senior individual contributor who enjoys building from scratch
- Someone who can act as a modeling expert and mentor in an organization formalizing this practice
- Comfortable working in ambiguity and taking initiative
- Strong communicator who enjoys partnering with both technical and non-technical teams
- Equally comfortable discussing business concepts and physical data models
Why This Role Is Unique
- Greenfield data modeling initiative with real influence
- Opportunity to define standards that will be used across the organization
- Work on large-scale, real-world operational and analytical data
- High visibility within a growing data organization
- Flexible work setup for individual contributors
If you’re excited about shaping a modern data foundation and want to be the person who defines how data is modeled, understood, and used, this is a rare opportunity to make a lasting impact.