Overview
We are seeking a seasoned Analytics leader to build and lead our enterprise Analytics and Data Governance function in a modern group purchasing / procurement environment. This leader will turn our rich ecosystem of member, supplier, contract, and transaction data into a strategic asset that drives savings, compliance, growth, and differentiated insight for our members and suppliers.
This leader will also own the data governance operating model, enterprise metrics, and analytics roadmap that power member-facing insights, internal performance management, and AI use cases across the technology platform (Website, B2B eCommerce, supplier portal, sourcing tools, and partner integrations).
Key responsibilities
Data governance and policy
- Define and run the enterprise data governance framework covering member, supplier, contract, item, and transaction data domains.
- Establish data ownership and stewardship across functions (Category Management, Supplier Management, Finance, Sales, Marketing, Digital), driving clear accountabilities for data quality and definitions.
- Implement policies for responsible use of data in supplier programs, member reporting, and AI/ML models, ensuring compliance with contractual, regulatory, and privacy requirements.
- Drive data quality management (profiling, remediation, SLAs) for critical assets such as contract price files, item catalogs, rebate/accrual data, and member hierarchies.
- Oversee metadata, business glossary, and data lineage so teams can confidently understand "one source of truth" for core GPO metrics (e.g., committed vs. actual spend, penetration, compliance, savings delivered).
Analytics strategy and delivery
- Define the enterprise analytics vision and roadmap aligned to procurement value levers: spend visibility, category performance, contract compliance, leakage detection, rebate optimization, and supplier performance.
- Lead the design and delivery of standardized KPI suites and dashboards for executives, category teams, supplier partners, and member account teams (e.g., savings scorecards, compliance heatmaps, portfolio optimization).
- Partner with Product and Engineering to ensure the data platform (warehouse, semantic layer, BI tools) can support self-service analytics, embedded insights in member/supplier portals, and AI-driven use cases.
- Champion enterprise metrics and advanced analytics capabilities such as forecasting, benchmarking, opportunity sizing, and integrity analytics, ensuring models are traceable, governed, and auditable.
- Translate business needs into clear data products (curated data sets, subject-area marts, APIs) that serve both internal teams and external-facing solutions.
Stakeholder leadership and collaboration
- Serve as the enterprise "single point of accountability" for data and analytics, aligning priorities across Technology, Category Management, Supplier Relations, Sales, Finance, and Operations.
- Partner with Supplier and Member-facing teams to co-create analytics offerings that differentiate the GPO (e.g., supplier growth playbooks, member CFO dashboards, public-sector transparency packs).
- Educate executives and business leaders on data literacy, standard metrics, and how to use insights in planning, negotiations, and supplier programs.
- Collaborate closely with Security, Legal, and Compliance to ensure that member and supplier data is used ethically and in line with contracts and regulations.
Team building and operations
- Build and lead a high-performing team of data analysts, analytics engineers, data governance managers, and data stewards.
- Define operating rhythms (data council, data domain forums, metric review cadences) that keep governance and analytics tightly connected to business outcomes.
- Establish and track KPIs for the data function itself (data quality scores, adoption of governed datasets, BI usage, time-to-insight).
- Select and manage key tools and vendors in the analytics and governance ecosystem (warehouse, BI, catalog/governance, quality monitoring).
Qualifications
- Bachelor's or Master's degree in Data/Computer Science, Information Systems, Analytics, Statistics, Business, or related field.
- 10+ years of experience in analytics, data governance, or enterprise data management, including 3–5+ years leading teams.
- Proven experience in a procurement, supply chain, GPO, distribution, or B2B marketplace environment strongly preferred.
- Demonstrated success implementing data governance frameworks and delivering analytics that directly influenced commercial or procurement outcomes (e.g., savings, compliance, supplier growth).
- Hands-on familiarity with modern data platforms (e.g., Snowflake/BigQuery/Redshift, dbt, Power BI/Tableau/Looker, and one or more data catalog/governance tools).
- Strong grasp of regulatory / contractual considerations relevant to member and supplier data (data sharing agreements, use of benchmarking, privacy/security standards).
- Excellent leadership, storytelling, and stakeholder management skills; able to influence at C-suite and board levels.
Attributes for success
- Business-first mindset: instinctively ties data work to member value, supplier value, and financial impact.
- Pragmatic operator: balances governance rigor with speed, enabling innovation rather than blocking it.
- Skilled translator: can convert complex data and AI topics into clear narratives for executives, sales, and category leaders.
- Culture builder: passionate about creating a data-driven culture that values standard definitions, trusted data, and measurable outcomes.
Compensation:
$150,000 to $200,000 per year.
Exact compensation may vary based on several factors, including skills, experience, and education.
Benefit packages for this role may include healthcare insurance offerings and paid leave as provided by applicable law.
Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.
Security Advisory: Beware of Frauds
Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.
We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.
Responsibilities
- Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
- Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
- Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
- API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
- Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
- Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
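The auditing work described above centers on forensic SQL against relational product data. As a rough, hypothetical sketch (the table and column names `products` and `product_images` are illustrative only, not the employer's actual schema), an "orphan record" audit of the kind mentioned can be expressed as a LEFT JOIN that flags child rows whose parent no longer exists:

```python
# Hypothetical sketch of an "orphan record" audit. Schema and data are
# invented for illustration; a real audit would run against the PIM database.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE products (sku TEXT PRIMARY KEY, name TEXT, price REAL);
    CREATE TABLE product_images (image_id INTEGER PRIMARY KEY, sku TEXT, url TEXT);
    INSERT INTO products VALUES ('RUG-001', 'Wool Area Rug', 299.00);
    INSERT INTO product_images VALUES (1, 'RUG-001', 'https://example.com/rug001.jpg');
    INSERT INTO product_images VALUES (2, 'RUG-999', 'https://example.com/rug999.jpg');
""")

# A LEFT JOIN keeps every image row; a NULL on the products side flags an
# image pointing at a SKU that no longer exists -- an "orphan" record.
orphans = cur.execute("""
    SELECT i.image_id, i.sku
    FROM product_images AS i
    LEFT JOIN products AS p ON p.sku = i.sku
    WHERE p.sku IS NULL
""").fetchall()

print(orphans)  # image 2 references RUG-999, which has no matching product
conn.close()
```

The same LEFT JOIN / IS NULL pattern extends across schemas by fully qualifying table names, which is how consistency between systems (e.g., PIM vs. storefront) would be validated.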
What You Will NOT Do (The Boundaries)
- No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
- No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
- No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.
Intersection with Technical Teams
- With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
- With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.
Experience, Skills, & Ability Requirements
- 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
- SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
- API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
- Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
- You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
- You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
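The API-fluency requirement above (reading JSON, spotting missing keys and formatting errors) amounts to payload triage: decide whether a response is malformed, missing a required field, or carrying a wrongly typed value. A minimal sketch, assuming hypothetical key names (`sku`, `price`, `weight_lbs`, `image_url` are invented for illustration, not a real feed spec):

```python
# Minimal payload triage sketch: classify problems in a product JSON payload.
# Required keys and their expected types are hypothetical examples.
import json

REQUIRED_KEYS = {
    "sku": str,
    "price": (int, float),
    "weight_lbs": (int, float),
    "image_url": str,
}

def find_payload_errors(raw: str) -> list[str]:
    """Return a list of problems with a product payload (empty list = valid)."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"malformed JSON: {exc}"]
    errors = []
    for key, expected_type in REQUIRED_KEYS.items():
        if key not in payload:
            errors.append(f"missing key: {key}")
        elif not isinstance(payload[key], expected_type):
            errors.append(f"bad type for {key}: {type(payload[key]).__name__}")
    return errors

good = '{"sku": "RUG-001", "price": 299.0, "weight_lbs": 18.5, "image_url": "https://example.com/r.jpg"}'
bad = '{"sku": "RUG-002", "price": "299.00", "image_url": "https://example.com/r2.jpg"}'

print(find_payload_errors(good))  # []
print(find_payload_errors(bad))   # ['bad type for price: str', 'missing key: weight_lbs']
```

An empty result hands the issue off to Engineering as a probable code/caching bug; a non-empty result points back at the data payload, which mirrors the handoff rule described in the role.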
Bonus Points (Nice-to-Haves)
- Familiarity with Visio/Lucidchart to visualize data flows.
- Ability to build simple dashboards in Tableau to track data health scores.
- Basic familiarity with Python or R for data manipulation.
What We Offer
- Health, dental, and vision benefits
- Paid parental leave
- 401(k) with employer match
- A culture of meritocracy that fosters ongoing growth opportunities
- A stable, growing family-owned company that looks after its employees
Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.
The University of Maryland (UMD) seeks a Manager of Data Analytics Enablement to lead the adoption and modernization of enterprise analytics capabilities that enable trusted, data-informed decision-making across campus.
This is an exciting time to join UMD as we advance enterprise data and analytics through a period of innovative growth and modernization.
This role will play a key part in shaping the future of enterprise business intelligence, advancing Microsoft Power BI and Fabric capabilities, and embedding sustainable data quality and stewardship practices into analytics workflows.
Reporting to the Director of Enterprise Data Services, this position partners with institutional leaders, IT teams, and enterprise stakeholders to deliver reliable data products, consistent metrics, and actionable insights.
The manager will lead a team of data professionals and advance practical, operational governance practices that support trusted analytics and long-term institutional impact.
Key Responsibilities: Lead the strategy, development, and continuous improvement of the university’s enterprise business intelligence environment, including Microsoft Power BI and Microsoft Fabric.
Establish standards, best practices, and architectural patterns for semantic models, dashboards, and analytics delivery.
Guide migration and modernization efforts to ensure scalable, secure, and high-performing analytics solutions.
Develop and manage an analytics intake, prioritization, and delivery framework aligned with institutional priorities.
Define and implement data quality monitoring practices to ensure reliability, accuracy, and consistency of enterprise data assets.
Partner with technical teams to embed validation, monitoring, and observability into data pipelines and lakehouse environments.
Promote consistent metric definitions and collaborate with campus stakeholders to clarify data ownership and stewardship roles.
Support adoption of metadata management, data catalog, and lineage capabilities.
Ensure analytics solutions align with university standards for security, privacy, and responsible data use.
Manage, mentor, and develop a team of analytics and data professionals, fostering a culture of quality, collaboration, and service.
Communicate analytics priorities, progress, and impact to leadership and campus partners.
**This position is considered essential and may be required to work at the normal work location or an alternative location during a major catastrophic event, weather emergency, or other operational emergency to help maintain the continuity of University services.**
**May be required to work evenings, nights, weekends, or different shifts for extended periods.**
KNOWLEDGE, SKILLS, & ABILITIES:
Knowledge of data privacy and security principles and practices necessary to protect systems and data from threats.
Knowledge in areas of subject matter expertise such as databases, data modeling, ETL, reporting, data governance practices, metadata management, data stewardship, and/or regulatory compliance.
Skill in SQL or programming/scripting languages (e.g., Python) used for integrations, data pipelines, report development, and data management.
Skill in adapting communication style to different audiences, including technical, business, and executive stakeholders.
Skill in the use of office productivity software such as Office 365 or Google Workspace.
Ability to lead presentations and training for large groups.
Ability to manage communications and relationships with technical and business stakeholders.
Ability to collaborate effectively with other Managers, Assistant Directors, and Directors to identify and solve problems, make improvements, and address ongoing issues.
Ability to provide a team with effective direction and support in implementations using standards and techniques that lead to a repeatable and reliable solution.
Ability to ensure documentation standards and procedures are implemented for all team responsibilities.
Ability to define deadlines and manage the quality of the work delivered.
Ability to comprehend and handle interpersonal dynamics, demonstrate empathy towards team members, and effectively manage conflicts or challenging circumstances.
Ability to coach and mentor team members in order to enhance their performance, provide constructive feedback, and support skill development.
Physical Demands: Sedentary work.
Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects.
Repetitive motion.
Substantial movements (motions) of the wrists, hands, and/or fingers.
The worker is required to have close visual acuity to perform an activity such as: preparing and analyzing data and figures; transcribing; viewing a computer terminal; extensive reading.
Minimum Qualifications Education: Bachelor’s degree from an accredited college or university.
Experience: Three (3) years of professional experience supporting the operations, maintenance, and administration of data systems, analytics platforms, or data management programs.
One (1) year leading or supervising professional staff.
Other: Additional work experience as defined above may be substituted on a year for year basis for up to four (4) years of the required education.
Preferences: Demonstrated experience leading business intelligence or enterprise analytics initiatives.
Experience managing or mentoring data professionals in a collaborative team environment.
Strong experience with Power BI and modern data platforms such as Microsoft Fabric, Databricks, or similar cloud-based analytics ecosystems.
Proficiency with SQL and/or Python in support of analytics, data modeling, or data quality initiatives.
Experience implementing or advancing data quality practices, including validation, monitoring, or metric standardization.
Experience supporting practical data governance activities such as establishing shared definitions, coordinating data stewardship, or implementing metadata/catalog tools.
Demonstrated ability to collaborate across diverse stakeholders and translate business needs into scalable analytics solutions.
Strong communication skills with the ability to engage both technical and non-technical audiences.
Experience using Jira or similar tools for work intake, project tracking, and prioritization.
Additional Information: Please note that all positions within the Division of Information Technology (DIT) have an in-person component, with expected time each week in our College Park, MD location.
Telework is not a guaranteed work arrangement.
Visa Sponsorship Information: DIT will not sponsor the successful candidate for work authorization in the United States now or in the future.
F1 STEM OPT support is not available for this position.
Required Application Materials: Resume, Cover Letter, List of three References
Best Consideration Date: March 26, 2026
Open Until Filled: Yes
Salary Range: $149,120.00 - $178,944.00
Please apply at:
Job Risks: Not Applicable to This Position
Financial Disclosure Required: No
For more information on Financial Disclosure, please visit Maryland's State Ethics Commission website.
Department: DIT-EE-Enterprise Data Services
Worker Sub-Type: Staff Regular
Benefits Summary: For more information on Regular Exempt benefits, select this link.
Background Checks: Offers of employment are contingent on completion of a background check.
Information reported by the background check will not automatically disqualify anyone from employment.
Before any adverse decision, the finalist will have an opportunity to provide information to the University regarding disclosable background check information.
The University reserves the right to rescind the offer of employment or otherwise decline or terminate employment if the information reported by the background check is deemed incompatible with the position, regardless of when the background check is completed.
Employment Eligibility: The successful candidate must complete employment eligibility verification (on Form I-9) by presenting documents that establish identity and work authorization within the timeframe required by federal immigration law, and where applicable, to demonstrate renewed employment authorization.
Failure to complete employment eligibility verification or reverification within the timeframe set forth by law may result in suspension or termination of employment.
EEO Statement : The University of Maryland, College Park is an Equal Opportunity Employer.
All qualified applicants will receive equal consideration for employment.
Please read the University’s Equal Employment Opportunity Statement of Policy.
Title IX Non-Discrimination Notice
IFBF is Iowa's largest farm organization, established in 1918.
We remain a statewide, non-profit, grassroots farm organization dedicated to creating a vibrant future for agriculture, farm families, and rural communities.
The Information Resources department is responsible for creating systems to manage memberships and support the ongoing business of Iowa Farm Bureau.
What You'll Do: We are seeking an experienced and skilled Senior Full Stack Developer with expertise in Azure, C#, .NET, SQL, API integration, and frontend development frameworks like Angular.
As a senior developer, you will play a pivotal role in designing, developing, and deploying scalable web applications and cloud-based solutions that support our business needs.
You will work closely with cross-functional teams to ensure our applications are secure, high-performing, and user-friendly, utilizing best practices in cloud architecture, API management, and identity management via Azure Entra ID.
You will also:
• Architect, design, and develop full stack applications and APIs using C#, .NET, SQL, and Angular for both internal and external-facing applications.
• Leverage Azure cloud services, including Azure App Services, Azure Functions, Azure SQL, and Azure Storage, to build scalable, reliable applications.
• Develop, deploy, and manage RESTful APIs that enable data and functionality sharing across platforms, ensuring optimal performance and scalability.
• Implement authentication and authorization using Azure Entra ID, including single sign-on, multi-factor authentication, and role-based access control (RBAC).
• Work with SQL Server and other database systems to design schemas, optimize queries, and manage database performance.
• Build and maintain user interfaces using Angular and other frontend frameworks, ensuring a responsive, consistent, and user-friendly experience.
• Ensure the quality and reliability of code through best practices, including unit testing, integration testing, code reviews, and adherence to coding standards.
• Provide comprehensive documentation for applications, APIs, and systems architecture; support troubleshooting and performance optimization as needed.
• Mentor junior developers, participate in code reviews, and collaborate with cross-functional teams to align technology solutions with business goals.
What It Takes to Join Our Team:
• Bachelor's degree in Computer Science, Information Technology, or related field.
• 5+ years of experience in full stack development with a focus on Azure, C#, .NET, and Angular.
• Strong proficiency in C#, .NET, Azure, SQL, API design, Angular, and Azure Entra ID required.
• Strong analytical and problem-solving skills, with a solution-oriented mindset.
• Ability to work both independently and collaboratively in a team environment.
• Excellent communication and documentation skills.
• Experience with DevOps practices and tools, such as Azure DevOps, CI/CD pipelines, and version control (Git), preferred.
• Familiarity with containerization (Docker) and orchestration (Kubernetes) in the Azure ecosystem preferred.
• Experience in optimizing cloud architecture for cost-effectiveness and scalability preferred.
What We Offer You: When you're on our team, you get more than a great paycheck.
You'll hear about career development and educational opportunities.
We offer an enhanced 401K with a match, a defined benefit plan, low-cost health, dental, and vision benefits, and life and disability insurance options.
We also offer paid time off, including holidays and volunteer time, and teams who know how to have fun.
Add to that an onsite wellness facility with fitness classes and programs, a daycare center, and a cafeteria.
Iowa Farm Bureau... where the grass really IS greener!
Work Authorization/Sponsorship: Applicants must be currently authorized to work in the United States on a full-time basis.
We are not able to sponsor now or in the future, or take over sponsorship of, an employment visa or work authorization for this role.
For example, we are not able to sponsor OPT status.
*Must be a US Citizen or Green Card Holder.
We are seeking an experienced Data Science Manager to lead a high-impact team focused on developing and deploying machine learning solutions within a large-scale healthcare technology environment.
This role combines technical leadership, hands-on data science expertise, and strategic collaboration to drive innovation across analytics and AI initiatives. The ideal candidate will be a player-coach, capable of leading a team while remaining engaged in solution design and development.
Key Responsibilities
- Lead, mentor, and develop a team of data scientists, fostering a collaborative and high-performing environment
- Oversee the full data science lifecycle, including problem definition, data exploration, model development, validation, and production deployment
- Design and guide the development of machine learning and statistical models that drive business value
- Translate complex business requirements into scalable, data-driven solutions
- Partner closely with Product, Engineering, and business stakeholders to align on priorities and deliver impactful outcomes
- Establish and promote best practices in data science, machine learning, and model lifecycle management
- Communicate technical concepts and insights effectively to non-technical audiences
- Contribute to strategic planning and roadmap development for data and AI initiatives
Qualifications
- Bachelor's degree in Computer Science, Data Science, Engineering, or related field (or equivalent experience)
- 8+ years of experience in data science, analytics, or machine learning development
- 5+ years of experience managing or leading data science or analytics teams
- Strong hands-on experience building and deploying production-grade machine learning models
- Experience working within Agile or modern software development environments
- Deep understanding of machine learning methodologies, statistical modeling, and data architecture
- Proven ability to collaborate cross-functionally and influence stakeholders
- Experience working with healthcare data or healthcare technology platforms
- Experience building or scaling a data science or AI team
- Advanced degree (Master's or PhD) in a related field
- Experience driving AI/ML initiatives that support decision-making or recommendation systems
- This is a remote position (US-based candidates only)
Role: Technical Product Manager (Data Analytics)
Location: Austin, TX (Onsite) - local to Texas candidates only (other states will not be considered)
Experience Required: 10+ years
Rate: $55/hr on W2 max
Skills Mandatory:
1. Marketing data analysis knowledge.
2. KPI and metrics definition on marketing data, mainly for media products.
3. Instrumentation knowledge and thought process.
Original JD:
Key Qualifications:
- 7+ years of experience in a Data Visualization, Data Scientist, or Data Analyst role, preferably for a digital subscription business.
- Strong proficiency with SQL-based languages is required. Experience with large-scale data technologies such as Hadoop and PySpark.
- Proficiency with data visualization tools such as Tableau and/or MicroStrategy for analysis, insight synthesis, data product delivery, and executive presentation.
- You have a curious business mindset with an ability to condense complex concepts and analysis into clear and concise takeaways that drive action.
- Excellent communication, social, and presentation skills with meticulous attention to detail.
- Strong time management skills with the ability to handle multiple projects with tight deadlines and executive visibility.
- Be known for successfully bridging analytics and business teams, with an ability to speak the language of both.
Job Description :
- Build dashboards, self-service tools, and reports to analyze and present data associated with customer experience, product performance, business operations, and strategic decision-making.
- Create datasets and develop global dashboards, data pipelines, sophisticated security controls, and scalable ad-hoc reporting
- Closely partner with our Data Science team to define metrics, datasets, and automation strategy
- Engage with Product, Business, Engineering, and Marketing teams to capture requirements, influence how our services are measured, and craft world-class tools to support those partners.
- Establish a comprehensive roadmap to communicate and manage our commitments and stakeholder expectations while enabling org-wide transparency on progress.
- Focus on scale and efficiency - create and implement innovative solutions and establish best practices across our full scope of delivery
- Education: Minimum of a Bachelor's degree in Computer Science, Statistics, Mathematics, Engineering, Economics, or related field.
Technical Product Management
Key Qualifications:
- Experience in a Technical Product Management role, preferably for a digital-media or subscription business.
- Knowledge of Client-Server metrics logging strategies as well as data architecture required for analysis
- Hands-on experience with the end-to-end data lifecycle across petabyte-scale technologies
- Prior experience in a technical role (preferably as a data analyst or engineer), delivering data insights to stakeholders
- Strong experience designing and driving product strategy cross-functionally, collaborating with partners of various technical levels.
Nice to have :
• Experience in data-related programming languages (e.g. SQL, PySpark, Python, or R)
Description :
- Data is our product. We are looking for a self-starting, upbeat individual with excellent communication skills who is passionate about managing and developing critical datasets to maximize Data Science capabilities. You should have a strong interest in driving large-scale data products, engaging with key business stakeholders, and driving critical communications throughout the business.
Stephen
Lead Talent Acquisition Specialist
Email :
The Data Engineering Manager is responsible for leading and developing a team of Data Architects and Data Solutions Engineers while actively contributing to hands-on technical projects. This role will manage the data warehouse in Snowflake, engineering automations in Alteryx and/or other solutions, while ensuring efficient project intake and prioritization. The ideal candidate combines strong technical expertise with proven technical leadership skills to drive innovation and operational excellence across the data engineering function.
As a Data Engineering Manager, you will:
- Set the technical strategy for data engineering solutions and data architecture which includes end to end data pipeline strategy, consumption management, project scoping, and data automation.
- Design, develop, and optimize data engineering solutions using Snowflake, DBT, Azure Data Factory, and Alteryx.
- Continuously assess and optimize the data engineering technology stack to ensure scalability, performance, and alignment with industry best practices.
- Implement best practices for data modeling, ETL/ELT processes, and automation.
- Own and maintain the Snowflake data warehouse roadmap and engineering standards.
- Lead data project scoping, prioritization, and resource allocation to ensure timely delivery of data engineering solutions.
- Ensure data integrity, security, and compliance across all engineering solutions.
- Collaborate with IT and the rest of the data teams to align solutions with enterprise architecture.
- Establish documentation and governance standards for data engineering workflows ensuring completeness, audit readiness, and traceability in alignment with enterprise architecture.
- Directly supervise the Data Architecture & Data Engineering team in accordance with Nicolet's policies and applicable laws. Responsibilities include interviewing, hiring, and training employees; planning, assigning, and directing work; appraising performance; coaching, mentoring and development planning; rewarding and disciplining employees; addressing complaints and resolving problems.
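The ELT pattern named in the responsibilities above (land raw data first, then transform it into modeled layers, the way dbt models are built on top of a Snowflake warehouse) can be sketched in plain Python. This is only an illustration; the table and field names are invented for the example, not taken from the posting.

```python
from datetime import date

# Raw rows as they might land in a staging schema (strings, inconsistent casing).
raw_orders = [
    {"order_id": "1", "member": " Acme ", "amount": "125.50", "order_date": "2024-03-01"},
    {"order_id": "2", "member": "acme",   "amount": "74.25",  "order_date": "2024-03-02"},
    {"order_id": "3", "member": "Globex", "amount": "310.00", "order_date": "2024-03-02"},
]

def stg_orders(rows):
    """Staging model: cast types and normalize keys (the 'T' in ELT)."""
    return [
        {
            "order_id": int(r["order_id"]),
            "member": r["member"].strip().lower(),
            "amount": float(r["amount"]),
            "order_date": date.fromisoformat(r["order_date"]),
        }
        for r in rows
    ]

def mart_spend_by_member(staged):
    """Mart model: aggregate cleaned staging rows for reporting."""
    totals = {}
    for r in staged:
        totals[r["member"]] = totals.get(r["member"], 0.0) + r["amount"]
    return totals

print(mart_spend_by_member(stg_orders(raw_orders)))
# {'acme': 199.75, 'globex': 310.0}
```

The layering (raw → staging → mart) mirrors the modeling best practice the role calls for: each layer has one job, so data quality fixes land in exactly one place.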
Qualifications:
- Bachelor's degree in Computer Science, Data Engineering, Data Analytics or related field.
- 7+ years in data engineering or related data roles required.
- 3+ years in leadership or management positions required.
- Strong technical expertise in Snowflake, dbt, Azure Data Factory, and SQL, or similar systems.
- Familiarity with Alteryx, UiPath, Tableau, Power BI and Salesforce is preferred.
- Ability to design and implement scalable data solutions.
- Excellent leadership, communication, and organizational skills.
- Ability to balance hands-on development with team development.
- Must be able to work fully in-office. This position does not allow for remote work.
Benefits:
- Medical, Dental, Vision, & Life Insurance
- 401(k) with a company match
- PTO & 11 1/2 Paid Holidays
The above statements are intended to describe the general nature and level of work being performed. They are not intended to be construed as an exhaustive list of all responsibilities and skills required for the position.
Equal Opportunity Employer/Veterans/Disabled
Role: Technical Product Manager (Data / Analytics)
Location: Austin, TX (Onsite)
Experience Required: 10+ years
Mandatory Skills:
1. Marketing Data Analysis knowledge.
2. KPI and metrics definition on Marketing Data, mainly for media products.
3. Instrumentation knowledge and thought process.
Original JD:
Key Qualifications:
- 7+ years of experience in a Data Visualization, Data Scientist, or Data Analyst role, preferably for a digital subscription business.
- Strong proficiency with SQL-based languages is required. Experience with large-scale data technologies such as Hadoop and PySpark.
- Proficiency with data visualization tools such as Tableau and/or MicroStrategy for analysis, insight synthesis, data product delivery, and executive presentation.
- You have a curious business mindset with an ability to condense complex concepts and analysis into clear and concise takeaways that drive action.
- Excellent communication, social, and presentation skills with meticulous attention to detail.
- Strong time management skills with the ability to handle multiple projects with tight deadlines and executive visibility.
- Be known for successfully bridging analytics and business teams, with an ability to speak the language of both.
Job Description :
- Build dashboards, self-service tools, and reports to analyze and present data associated with customer experience, product performance, business operations, and strategic decision-making.
- Create datasets and develop global dashboards, data pipelines, sophisticated security controls, and scalable ad-hoc reporting.
- Closely partner with our Data Science team to define metrics, datasets, and automation strategy.
- Engage with Product, Business, Engineering, and Marketing teams to capture requirements, influence how our services are measured, and craft world-class tools to support those partners.
- Establish a comprehensive roadmap to communicate and manage our commitments and stakeholder expectations while enabling org-wide transparency on progress.
- Focus on scale and efficiency - create and implement innovative solutions and establish best practices across our full scope of delivery
- Education: Minimum of a Bachelor's degree in Computer Science, Statistics, Mathematics, Engineering, Economics, or a related field.
Technical Product Management
Key Qualifications:
- Experience in a Technical Product Management role, preferably for a digital-media or subscription business.
- Knowledge of Client-Server metrics logging strategies as well as data architecture required for analysis
- Hands-on experience with the end-to-end data lifecycle across petabyte-scale technologies
- Prior experience in a technical role (preferably as a data analyst or engineer), delivering data insights to stakeholders
- Strong experience designing and driving product strategy cross-functionally, collaborating with partners of various technical levels.
Nice to have:
- Experience in data-related programming languages (e.g., SQL, PySpark, Python, or R)
Description :
- Data is our product. We are looking for a self-starting, upbeat individual with excellent communication skills who is passionate about managing and developing critical datasets to maximize Data Science capabilities. You should have a strong interest in driving large-scale data products, engaging with key business stakeholders, and driving critical communications throughout the business.
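The dashboard-dataset responsibility this posting describes (rolling raw events up into daily metric rows that a dashboard can read) can be sketched minimally in Python. The event schema and KPI names below are hypothetical, chosen only to make the shape of the work concrete.

```python
from collections import defaultdict

# Hypothetical event log feeding a product-performance dashboard.
events = [
    {"day": "2024-05-01", "user": "u1", "action": "play"},
    {"day": "2024-05-01", "user": "u2", "action": "play"},
    {"day": "2024-05-01", "user": "u1", "action": "cancel"},
    {"day": "2024-05-02", "user": "u3", "action": "play"},
]

def daily_kpis(rows):
    """Roll raw events up into one KPI row per day for a dashboard dataset."""
    days = defaultdict(lambda: {"plays": 0, "cancels": 0, "users": set()})
    for r in rows:
        d = days[r["day"]]
        d["users"].add(r["user"])
        if r["action"] == "play":
            d["plays"] += 1
        elif r["action"] == "cancel":
            d["cancels"] += 1
    # Distinct-user counts are resolved last, once all events are seen.
    return {
        day: {"plays": v["plays"], "cancels": v["cancels"],
              "active_users": len(v["users"])}
        for day, v in sorted(days.items())
    }
```

Pre-aggregating like this is what makes "scalable ad-hoc reporting" possible: the dashboard reads a small KPI table instead of scanning raw events on every load.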
Must be local to TX.
Skills:
- Delivery management: own and deliver the 2026 roadmap; interact with the business, explain the value proposition, and understand their business and standard rules.
- Manage timelines.
- Partner with business segments.
- Measure Data Quality scores before and after remediation.
Technical:
- Articulate technical designs and solutions.
- Know the capabilities of Collibra and Soda, and how to use those tools.
- Proactive communication skills.
- 12+ years of experience in a Technical Project Manager type of role, with solutioning and problem-solving skills.
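A before/after data quality score, as called for above, can be as simple as a completeness percentage over required fields. This is a generic stdlib-Python sketch, not a Collibra or Soda feature; the property records and field names are hypothetical.

```python
def completeness_score(rows, required_fields):
    """Percent of required cells that are populated — one simple DQ dimension."""
    total = len(rows) * len(required_fields)
    filled = sum(
        1 for row in rows for f in required_fields
        if row.get(f) not in (None, "")
    )
    return round(100.0 * filled / total, 1) if total else 100.0

properties = [
    {"property_id": "P1", "address": "12 Main St", "owner": ""},
    {"property_id": "P2", "address": "",           "owner": "Acme"},
    {"property_id": "P3", "address": "9 Oak Ave",  "owner": "Globex"},
]

before = completeness_score(properties, ["property_id", "address", "owner"])

# After (hypothetical) stewardship remediation fills the gaps:
properties[0]["owner"] = "Initech"
properties[1]["address"] = "44 Pine Rd"
after = completeness_score(properties, ["property_id", "address", "owner"])

print(before, after)  # 77.8 100.0
```

Publishing the same score before and after remediation is what turns data quality work into a demonstrable delivery, rather than an invisible cleanup.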
Role Summary
The Data Governance Lead will design, build, and scale an enterprise data governance program from the ground up, using Collibra as the core platform for a large real estate enterprise. This senior role combines strategic leadership, hands‐on Collibra configuration, stakeholder management, and deep domain knowledge of real estate data. The incumbent will own the governance vision, operating model, and tooling, and will partner with business, IT, data engineering, analytics, legal, and compliance teams.
Key Responsibilities
1. Data Governance Strategy and Operating Model
- Define and implement the enterprise data governance strategy, roadmap, and operating model aligned to business objectives.
- Define governance KPIs, maturity metrics, and success measures.
- Drive adoption through change management, communications, and training.
2. Collibra Implementation from Scratch
- Lead end‐to‐end Collibra implementation: platform setup, environment planning (Dev/Test/Prod), domain modeling, and taxonomy design.
- Customize asset models for real estate use cases.
- Configure and manage Business Glossary, Data Dictionary, Data Catalog, and Reference Data & Code Sets.
- Design and implement Collibra workflows for glossary lifecycle, owner/steward assignment, issue management, and escalation.
- Implement Collibra operating model with defined roles (Data Owner, Data Steward, Custodian, Consumer) and RACI mappings.
- Integrate Collibra with data warehouses/lakes (Snowflake, BigQuery, Azure), BI tools (Power BI, Tableau), and ETL/ELT tools (Informatica, dbt, ADF).
- Lead metadata ingestion across technical, operational, and business metadata.
3. Data Ownership, Stewardship, and Accountability
- Define and institutionalize data ownership and stewardship across business units.
- Partner with business leaders to assign Data Owners and Stewards.
- Drive accountability for data definitions, data quality, and metadata completeness.
- Establish Data Governance Councils and working groups.
4. Data Quality and Issue Management
- Collaborate with data quality teams to define Critical Data Elements (CDEs) and align rules and thresholds.
- Configure Collibra issue management workflows and ensure traceability from issues to root causes and remediation actions.
- Provide governance oversight for remediation and continuous improvement.
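The CDE rule-and-threshold pattern above can be sketched generically: a rule is evaluated per record, the pass rate is compared to a threshold, and a shortfall produces an issue record that a remediation workflow can track. This is not the Collibra API; the lease fields and issue shape are illustrative only.

```python
# Evaluate one data-quality rule for a Critical Data Element (CDE) against a
# pass-rate threshold; a shortfall yields a traceable issue record.
def evaluate_cde(rows, cde, rule, threshold):
    passed = sum(1 for r in rows if rule(r.get(cde)))
    rate = passed / len(rows)
    issue = None
    if rate < threshold:
        issue = {"cde": cde, "pass_rate": rate, "threshold": threshold,
                 "status": "open"}  # would feed an issue-management workflow
    return rate, issue

leases = [
    {"lease_id": "L1", "annual_rent": 120000},
    {"lease_id": "L2", "annual_rent": -500},   # bad value
    {"lease_id": "L3", "annual_rent": 88000},
    {"lease_id": "L4", "annual_rent": None},   # missing
]

rate, issue = evaluate_cde(leases, "annual_rent",
                           rule=lambda v: v is not None and v > 0,
                           threshold=0.95)
print(rate, issue is not None)  # 0.5 True
```

The issue record carries the rule, rate, and threshold together, which is what gives the traceability from issue to root cause that the responsibility above asks for.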
5. Compliance, Risk, and Security Governance
- Define governance controls for regulatory compliance, contractual data, and financial reporting.
- Partner with Legal, Risk, and Security to classify sensitive data and apply access and usage policies.
- Implement data classification and privacy metadata within Collibra.
6. Stakeholder and Program Leadership
- Serve as the single point of accountability for the data governance program.
- Present progress, metrics, and risks to senior leadership.
- Mentor governance analysts, stewards, and platform administrators.
- Coordinate with system integrators and vendors as required.
Required Skills and Qualifications
Mandatory
- 12–18+ years in data management, data governance, or analytics leadership.
- Deep hands‐on experience implementing Collibra from scratch at enterprise scale.
- Strong expertise in business glossary and metadata management, stewardship models, and workflow automation in Collibra.
- Proven track record driving enterprise adoption of governance platforms.
- Excellent stakeholder management and communication skills.
Preferred
- Experience in real estate, property management, construction, facilities, or capital projects.
- Familiarity with DAMA‐DMBOK, DCAM, or similar governance frameworks.
- Exposure to data quality tools such as SODA, Great Expectations, or Informatica DQ.
- Experience integrating Collibra with cloud data platforms.
- Prior experience leading governance programs in large, federated organizations.
- Collibra certification is a plus.
Behavioral and Leadership Attributes
- Strategic thinker with strong execution capability.
- Balances business pragmatism with governance rigor.
- Influences without formal authority and drives change.
- Excellent storytelling and change management skills.
- Hands‐on leader who can configure Collibra and mentor teams.
Success Measures (First 12 Months)
- Collibra platform live with core real estate domains onboarded.
- Business glossary adopted across key business units.
- Formal data ownership established for critical datasets.
- Measurable improvement in metadata completeness and data quality visibility.
- Governance operating model embedded into daily business processes.
- At least 3 years' experience as an architect on large-scale cloud data projects, involving a minimum of 3 of the technological tracks listed under "Technical skills needed" on hyperscaler platforms.
- Minimum of 6 years' expertise in the data and analytics area.
- Deep understanding of databases and analytical technologies in the industry, including MPP and NoSQL databases, Data Lake and Data Warehouse design, BI reporting, and dashboard development.
- Experience with data architecture, data governance, data quality standards, and data security practices in at least 2 implementations.
- Experience with customer data models and developing KPIs from customer data.
- Experience in customer-facing roles providing solutions for data use cases.
- Certification in a cloud-based data stack.
- Experience in deployment of a large distributed Big Data application.
- Track record of thought leadership and innovation around Big Data, with a solid understanding of the data landscape and related emerging technologies.
Technical skills needed:
- Languages – Java, Python, Scala
- AWS – S3, EMR, Glue, Redshift, Athena, Lambda
- Azure – Blob Storage, ADLS, ADF, Synapse, Power BI
- Google Cloud – BigQuery, Dataproc, Looker
- Snowflake
- Databricks
- CDH – Hive, Spark, HDFS, Kafka, etc.
- ETL – Informatica, dbt, Matillion
Roles & Responsibilities:
- Architect and design Sales and Marketing data initiatives, demonstrating data architecture knowledge, customer management, and innovation.
- Deliver customer cloud data strategies for Marketing, aligned with the customer's business objectives and with a focus on cloud migrations.
- Provide leadership in platform migration methodologies and techniques, including governance frameworks, guidelines, and best practices.
- Build points of view, thought leadership, and solutions for proposals; support competency development and mentoring.
- Solution design experience on Data Lake, Data Warehouse, BI, Data Mart, and Analytics systems.