Syniti Data Migration Training Jobs in USA

14,808 positions found — Page 5

Databricks Architect / Senior Data Engineer
✦ New
🏢 OZ
Salary not disclosed
Boca Raton, FL 1 day ago

OZ – Databricks Architect / Senior Data Engineer


Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.


We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!


What We're Looking For:

We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.


This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.


Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.


Position Overview:

The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.


This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.


Key Responsibilities:

  • Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
  • Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
  • DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
  • Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
  • Performance Optimization: Tune Delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
  • GenAI Application Development: Experience building GenAI applications is a strong plus.
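The responsibilities above center on the Medallion (bronze/silver/gold) pattern named in the position overview. As a rough, hypothetical sketch of that layering, with plain Python dicts standing in for Spark DataFrames and all field names invented:

```python
# Minimal, illustrative sketch of the Medallion (bronze/silver/gold) flow,
# using plain Python dicts in place of Spark DataFrames / Delta tables.
# Field and function names here are hypothetical, not from the posting.

RAW_EVENTS = [  # "bronze": raw ingested records, possibly dirty
    {"user_id": "u1", "amount": "19.99", "ts": "2024-01-05"},
    {"user_id": "u1", "amount": "19.99", "ts": "2024-01-05"},  # duplicate
    {"user_id": "u2", "amount": "bad", "ts": "2024-01-06"},    # malformed
    {"user_id": "u2", "amount": "5.00", "ts": "2024-01-07"},
]

def to_silver(bronze):
    """Clean and deduplicate: drop unparseable amounts, collapse duplicates."""
    seen, silver = set(), []
    for row in bronze:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine; dropped here for brevity
        key = (row["user_id"], row["ts"], amount)
        if key not in seen:
            seen.add(key)
            silver.append({"user_id": row["user_id"], "amount": amount, "ts": row["ts"]})
    return silver

def to_gold(silver):
    """Aggregate to a business-level view: total spend per user."""
    totals = {}
    for row in silver:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(RAW_EVENTS))
print(gold)  # {'u1': 19.99, 'u2': 5.0}
```

In an actual Databricks pipeline each layer would be a Delta table and the transforms would be PySpark jobs; the point here is only the progressive refinement from raw to curated data.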


Requirements:

  • 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
  • Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
  • Strong programming skills in Python and SQL; experience with PySpark required.
  • Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
  • Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
  • Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
  • Strong understanding of data architecture, data modeling, and performance optimization.
  • Experience working with cross-functional teams to deliver enterprise data solutions.
  • Ability to tackle complex data challenges while ensuring data quality and reliable delivery.


Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience designing enterprise-scale data platforms and modern data architectures.
  • Experience with data integration tools such as Azure Data Factory or similar platforms.
  • Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
  • Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
  • Databricks, Azure, or cloud certifications are preferred.
  • Strong problem-solving, communication, and technical leadership skills.


Technical Proficiency in:

  • Databricks, Apache Spark, PySpark, Delta Lake
  • Python, SQL, Scala (preferred)
  • Cloud platforms: Azure (preferred), AWS, or GCP
  • Azure Data Factory, Kafka, and modern data integration tools
  • Data warehousing: Databricks, Snowflake, or Microsoft Fabric
  • DevOps tools: Git, Azure DevOps, CI/CD pipelines
  • Data architecture, ETL/ELT design, and performance optimization


What You’re Looking For:

Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.


About Us:

OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.


OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.

Data Architect - Consumer Platform
✦ New
Salary not disclosed

The pay range for this role is $150,000 - $200,000/yr USD.


WHO WE ARE:


Headquartered in Southern California, Skechers—the Comfort Technology Company®—has spent over 30 years helping men, women, and kids everywhere look and feel good. Comfort innovation is at the core of everything we do, driving the development of stylish, high-quality products at a great value. From our diverse footwear collections to our expanding range of apparel and accessories, Skechers is a complete lifestyle brand.


ABOUT THE ROLE:


The Skechers Digital Team is seeking a Digital Data Architect reporting to the Director, Digital Architecture, Consumer Domain. This role is responsible for designing and governing Skechers’ Consumer Data 360 ecosystem, enabling identity resolution, high-quality data foundations, personalization, loyalty intelligence, and machine learning capabilities across digital and retail channels.


The ideal candidate will be a strong technical leader with hands-on, full-stack technical knowledge of enterprise technologies related to Skechers’ consumer domain, and the ability to work in a fast-paced agile environment. You should have knowledge of consumer programs from an architecture/industry perspective and strong hands-on experience designing solutions on the Salesforce Core Platform (including configuration, integration, and data model best practices).


You will work cross-functionally with Digital Engineering, Data Engineering, Data Science, Loyalty, and Marketing teams to architect scalable, secure, and high-performance data platforms that support advanced personalization and recommender systems.


WHAT YOU’LL DO:


  • Responsible for the full technical life cycle of consumer platform capabilities, which includes:
      • Capability roadmap and technical architecture in alignment with consumer experience
      • Technical planning, design, and execution
      • Operations, analytics/reporting, and adoption
  • Define and evolve Skechers’ Consumer Data 360 architecture, including identity resolution (deterministic and probabilistic matching) and unified customer profiles.
  • Architect scalable data models and pipelines across CDP, CRM, e-commerce, marketing automation, data lake, and warehouse platforms.
  • Establish enterprise data quality frameworks including validation, deduplication, anomaly detection, and observability.
  • Optimize SQL workloads and large-scale distributed queries through performance tuning, partitioning, indexing, and workload management strategies.
  • Design and oversee ML pipelines supporting personalization, churn modeling, and recommender systems.
  • Partner with Data Science teams to productionize models using distributed platforms such as Databricks (Spark, Delta Lake, MLflow preferred).
  • Ensure secure data governance, access control (RBAC/ABAC), and compliance with GDPR, CCPA, and related privacy regulations.
  • Provide architectural oversight ensuring performance, scalability, resilience, and maintainability.
  • Collaborate with stakeholders to translate business objectives (LTV growth, personalization lift, engagement) into scalable data solutions.
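The identity-resolution bullet above distinguishes deterministic from probabilistic matching. A toy, hedged illustration of the two passes, using stdlib string similarity and entirely made-up records and thresholds:

```python
# Toy sketch of deterministic + probabilistic identity resolution.
# Records, threshold, and field names are invented for illustration only.
from difflib import SequenceMatcher

profiles = [
    {"id": 1, "email": "jane@example.com", "name": "Jane Doe"},
    {"id": 2, "email": None, "name": "Jon Smith"},
]
incoming = {"email": "jane@example.com", "name": "J. Doe"}

def resolve(record, known, threshold=0.6):
    # Deterministic pass: exact match on a strong identifier (email).
    for p in known:
        if record["email"] and p["email"] == record["email"]:
            return p["id"], "deterministic"
    # Probabilistic pass: fuzzy name similarity above a tuned threshold.
    def score(p):
        return SequenceMatcher(None, p["name"].lower(), record["name"].lower()).ratio()
    best = max(known, key=score)
    if score(best) >= threshold:
        return best["id"], "probabilistic"
    return None, "new_profile"

print(resolve(incoming, profiles))  # (1, 'deterministic')
```

Production identity resolution would add blocking, survivorship rules, and tuned scoring rather than a single similarity ratio; this only shows the ordering of the two passes.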


REQUIREMENTS:


  • Computer Science, Data Engineering, or related degree or equivalent experience.
  • 12+ years of experience architecting enterprise data platforms in cloud environments.
  • 9+ years of data engineering experience with a focus on consumer data.
  • 6+ years of experience working with Salesforce platforms, including data models and enterprise integrations.
  • Strong experience with Data 360 and identity resolution architectures.
  • Proven expertise in SQL performance tuning and large-scale data modeling.
  • Hands-on experience implementing ML pipelines and recommender systems in production environments.
  • Experience with cloud technologies (AWS, GCP, or Azure).
  • Experience with integration patterns (API, ETL, event streaming).
  • Experience providing technical leadership and guidance across multiple projects and development teams.
  • Experience translating business requirements into detailed technical specifications and working with development teams through implementation, including issue resolution and stakeholder communication.
  • Strong project management skills including scope assessment, estimation, and clear technical communication with both business users and technical teams.
  • Must hold at least one of the following Salesforce certifications: Platform App Builder, Platform Developer I, or JavaScript Developer I.
  • Experience with Databricks or similar distributed data/ML platforms preferred.
Data Analyst Manager
✦ New
Salary not disclosed
Hickory, NC 17 hours ago

Who We Are

At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.

What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.

Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?


Role Summary:

The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.


Responsibilities:

Data Architecture & Tooling

  • Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
  • Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
  • Ensure alignment between data infrastructure and evolving business needs across distribution operations
  • Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite

Data Governance & Quality

  • Define and enforce data standards, metric definitions, and naming conventions across all business domains
  • Establish data ownership, lineage documentation, and access governance policies
  • Implement monitoring and alerting for data quality issues across source systems and the warehouse
  • Build and maintain a data dictionary that serves as the single source of truth for the organization
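The governance bullets above call for validation, deduplication, and quality monitoring. A minimal sketch of rule-based quality checks, assuming invented field names and rules (not Feetures' actual systems):

```python
# Hedged sketch of rule-based data-quality monitoring. Rules, records,
# and field names are illustrative and hypothetical.

ORDERS = [
    {"order_id": "A1", "sku": "SOCK-01", "qty": 2},
    {"order_id": "A2", "sku": "", "qty": 1},         # missing SKU
    {"order_id": "A1", "sku": "SOCK-01", "qty": 2},  # duplicate order_id
    {"order_id": "A3", "sku": "SOCK-02", "qty": -5}, # impossible quantity
]

def run_quality_checks(rows):
    """Collect (rule, offending_id) pairs for every violated rule."""
    issues = []
    seen_ids = set()
    for row in rows:
        if not row["sku"]:
            issues.append(("missing_sku", row["order_id"]))
        if row["qty"] <= 0:
            issues.append(("invalid_qty", row["order_id"]))
        if row["order_id"] in seen_ids:
            issues.append(("duplicate_id", row["order_id"]))
        seen_ids.add(row["order_id"])
    return issues

issues = run_quality_checks(ORDERS)
# In production these findings would feed monitoring/alerting and a data
# dictionary of rule definitions; here we simply collect them.
print(issues)
```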

Orchestration of Analysts & Systems

  • Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
  • Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
  • Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
  • Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business


Qualifications:

Required

  • 3-5 years of experience in data analytics, business intelligence, or data engineering
  • 2+ years in a lead or management role overseeing analysts or data team members
  • Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
  • Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
  • Proven track record establishing data governance standards and documentation practices
  • Experience integrating and managing multiple data sources across SaaS and ERP platforms
  • Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders

Preferred

  • Experience in distribution, wholesale, or supply chain environments
  • Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
  • Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
  • Background working within IT-led or hybrid IT/Analytics teams


Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Life & Disability insurance
  • 401(K) with company match


Company Paid holidays and PTO:

  • Feetures offers 20 PTO Days which are available to you on day one of employment and are available to all employees, no matter your role. After working at Feetures for 5 years, your PTO days will increase to 25 days. Days can be used for vacations, appointments and sick days.
  • We offer 10 company paid holidays and 1 floating holiday per year.


Perks:

  • Parking provided (Charlotte office and onsite at Hickory office)
  • Employee Engagement team
  • Monthly stipend to pursue an active lifestyle


Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.

Technical Product Manager (Data Analytics)
✦ New
🏢 Nexwave
Salary not disclosed
Austin, TX 17 hours ago

Role: Technical Product Manager (Data Analytics)

Location: Austin, TX (onsite). Only candidates local to Texas will be considered; applicants from other states need not apply.

Experience Required: 10+ years


Rate: $55/hr on W2 (max)


Mandatory Skills:

1. Marketing data analysis knowledge.

2. KPI and metrics definition on marketing data, mainly for media products.

3. Instrumentation knowledge and thought process.


Original JD:

  • 7+ years of experience in a Data Visualization, Data Scientist, or Data Analyst role, preferably for a digital subscription business.
  • Strong proficiency with SQL-based languages is required; experience with large-scale data technologies such as Hadoop and PySpark.
  • Proficiency with data visualization tools such as Tableau and/or MicroStrategy for analysis, insight synthesis, data product delivery, and executive presentation.
  • You have a curious business mindset with an ability to condense complex concepts and analysis into clear and concise takeaways that drive action.
  • Excellent communication, social, and presentation skills with meticulous attention to detail.
  • Strong time management skills with the ability to handle multiple projects with tight deadlines and executive visibility.
  • Be known for successfully bridging analytics and business teams, with an ability to speak the language of both.

Job Description:

  • Build dashboards, self-service tools, and reports to analyze and present data associated with customer experience, product performance, business operations, and strategic decision-making.
  • Create datasets; develop global dashboards, data pipelines, sophisticated security controls, and scalable ad-hoc reporting
  • Closely partner with our Data Science team to define metrics, datasets, and automation strategy
  • Engage with Product, Business, Engineering, and Marketing teams to capture requirements, influence how our services are measured, and craft world-class tools to support those partners.
  • Establish a comprehensive roadmap to communicate and manage our commitments and stakeholder expectations while enabling org-wide transparency on progress.
  • Focus on scale and efficiency - create and implement innovative solutions and establish best practices across our full scope of delivery
  • Education: Minimum of a Bachelor’s degree in Computer Science, Statistics, Mathematics, Engineering, Economics, or a related field.


Key Qualifications:

  • Experience in a Technical Product Management role, preferably for a digital-media or subscription business.
  • Knowledge of Client-Server metrics logging strategies as well as data architecture required for analysis
  • Hands-on experience with the end-to-end data lifecycle across petabyte-scale technologies
  • Prior experience in a technical role (preferably as a data analyst or engineer), delivering data insights to stakeholders
  • Strong experience designing and driving product strategy cross-functionally, collaborating with partners of various technical levels.


Nice to have:

  • Experience in data-related programming languages (e.g., SQL, PySpark, Python, or R)


Description:

  • Data is our product. We are looking for a self-starting, upbeat individual with excellent communication skills who is passionate about managing and developing critical datasets to maximize Data Science capabilities. You should have a strong interest in driving large-scale data products, engaging with key business stakeholders, and driving critical communications throughout the business.


Stephen

Lead Talent Acquisition Specialist

Email :

Director of Business Intelligence & Data Services
✦ New
Salary not disclosed
Corvallis, OR 1 day ago
  • The Director of BI and Data Services will be primarily managing Power BI and Epic Cogito. Preferred candidates would reside in Oregon with the ability to come onsite.


  • We are also open to considering remote candidates out of state whom we can employ in the following states: Alabama, Alaska, Arizona, Arkansas, Connecticut, Florida, Georgia, Idaho, Indiana, Iowa, Kansas, Kentucky, Louisiana, Michigan, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Mexico, North Carolina, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia, West Virginia, or Wisconsin


JOB SUMMARY/PURPOSE

  • Oversees the design, implementation, and maintenance work of business intelligence, analytics, and data architecture managers. Responsible for the strategic leadership and direction of data architecture and business intelligence strategies to meet the data needs of the organization. Ensures system standards and compatibility with SHS information systems architecture, tools, policies, and procedures. Ensures that data initiatives align with organizational objectives and data governance policies and procedures. Responsible for budgeting of IS Business Intelligence and Data Services department, ensuring resources are allocated efficiently and cost-effectively to support the organization's goals. Leads maturation of Samaritan Health Services as a data driven, data literate health system, working with clinicians, business stakeholders, researchers, informaticists, IS colleagues, and others. Creates a culture of teaching, mentoring and professional development across the system. Maintains consultative, trusted-advisor relationships with stakeholders.


DEPARTMENT DESCRIPTION

  • Information Services is committed to providing leadership, support and coordination of technology at Samaritan Health Services. The IS Business Intelligence and Data Services department provides comprehensive data support as a centralized resource for the organization. Services include data integration and warehousing, analytic reporting, business intelligence solutions, statistical data analysis, and data science solutions.


EXPERIENCE/EDUCATION/QUALIFICATIONS

  • Bachelor's degree required (preferably in an Information Systems related field). An advanced degree in information systems, mathematics, statistics, analytics, engineering or a related field preferred.
  • Eight (8) years professional and/or leadership experience in data analytics/business intelligence/data science required.
  • Five (5) years experience in the management of technical resources required.
  • Two (2) years experience in business intelligence/data science delivery required.
  • Project management and continuous process improvement experience required.
  • Experience with Epic Cogito strongly preferred.
  • Experience in multiple data-related technologies required.
  • Project management certifications preferred.
  • Experience providing advanced business intelligence and data science solutions in a healthcare setting preferred.


KNOWLEDGE/SKILLS/ABILITIES

  • Leadership - Inspires, motivates, and guides others toward accomplishing goals. Achieves desired results through effective people management.
  • Conflict resolution - Influences others to build consensus and gain cooperation. Proactively resolves conflicts in a positive and constructive manner.
  • Critical thinking – Identifies complex problems. Involves key parties, gathers pertinent data and considers various options in decision making process. Develops, evaluates and implements effective solutions.
  • Communication and team building – Leads effectively with excellent verbal and written communication. Delegates and initiates/manages cross-functional teams and multi-disciplinary projects.
Not Specified
Lead Data Warehouse Engineer
✦ New
Salary not disclosed
New York, NY 11 hours ago

Description

The Scientific Computing & Data group at the Icahn School of Medicine at Mount Sinai (ISMMS) accelerates scientific discovery by supporting a high-performance computing and research data ecosystem. This includes a data commons and two clinical research data warehouses: one for ISMMS and one for the Kidney Precision Medicine Project (KPMP), a multi-institutional research consortium funded by the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK). Both warehouses use Microsoft SQL Server and the OMOP Common Data Model.


The Lead Data Warehouse Engineer is a senior technical specialist responsible for leading development, maintenance, and operations of these research data warehouses. The role collaborates with warehouse team members and research stakeholders to expand functionality and integrate new data sources. Data transformations are built in Transact-SQL stored procedures, with SSIS used for orchestration.


Responsibilities

  • Design databases and pipelines that balance functionality, performance, cost, and development time; evaluate technical options with the product manager.
  • Design, build, test, and maintain ETL/ELT processes using T-SQL stored procedures, SSIS, and SQL Agent; apply metadata-driven design for extensibility.
  • Serve as a team leader; contribute to project planning, work breakdown, dependency sequencing, and release management.
  • Develop and promote standards, conventions, design patterns, DevOps/SDLC best practices, and operational procedures for pipelines and warehouse maintenance.
  • Mentor junior engineers in data warehousing, data engineering skills, and operational support.
  • Design, build, and maintain data management processes, including loading flat files (csv, tsv, pipe-delimited, JSON).
  • Lead design sessions, code walkthroughs, peer reviews, and produce technical documentation.
  • Tune database objects, stored procedures, and pipelines to optimize performance and minimize compute and storage costs.
  • Monitor database and pipeline operations; lead troubleshooting and remediation of failures; provide occasional after-hours on-call support.
  • Collaborate with DBAs and system administrators on backups, performance tuning, statistics/index maintenance, and patching.
  • Provide high-quality customer service to researchers, clinicians, and internal partners; maintain a science‑driven, customer-focused approach.
  • Ensure patient privacy and data security in compliance with IRB & cybersecurity policies, HIPAA, 42 CFR Part 2, NYS Article 27-F, and other regulations.
  • Stay current with emerging technologies to improve capabilities, efficiency, quality, or cost.
  • Identify improvements in procedures, technology, compliance, and data privacy/security.
  • Periodically assist DBAs with user provisioning, backups, restorations, capacity planning, and performance monitoring.
  • Perform related duties as assigned.
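The posting's pipelines are built in T-SQL stored procedures with SSIS orchestration; as a language-neutral illustration of the metadata-driven, watermark-based incremental load idea mentioned above, here is a small Python toy with invented table and column names:

```python
# Toy sketch of metadata-driven incremental loading: a metadata table
# drives which sources load and how, so adding a source means adding a
# metadata row rather than new code. All names here are invented; the
# real implementation described in the posting uses T-SQL and SSIS.

METADATA = [
    {"source": "labs",   "target": "stg_labs",   "watermark_col": "updated_at"},
    {"source": "visits", "target": "stg_visits", "watermark_col": "updated_at"},
]

SOURCE_DATA = {
    "labs":   [{"id": 1, "updated_at": 5}, {"id": 2, "updated_at": 12}],
    "visits": [{"id": 9, "updated_at": 3}],
}

def incremental_load(metadata, sources, watermarks):
    """Load only rows newer than each source's stored watermark."""
    targets = {}
    for m in metadata:
        wm = watermarks.get(m["source"], 0)
        fresh = [r for r in sources[m["source"]] if r[m["watermark_col"]] > wm]
        targets[m["target"]] = fresh
        if fresh:  # advance the watermark for the next run
            watermarks[m["source"]] = max(r[m["watermark_col"]] for r in fresh)
    return targets

watermarks = {"labs": 5, "visits": 0}
loaded = incremental_load(METADATA, SOURCE_DATA, watermarks)
print(loaded)      # {'stg_labs': [{'id': 2, 'updated_at': 12}], 'stg_visits': [{'id': 9, 'updated_at': 3}]}
print(watermarks)  # {'labs': 12, 'visits': 3}
```

The same pattern extends to change data capture and source-to-target mappings: the control data, not the code, encodes each pipeline's specifics.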


Qualifications

  • Bachelor’s degree in a technical field; Master’s preferred.
  • 12–15 years of related experience, including 7+ years designing, developing, and maintaining relational databases, data pipelines, and dimensional/OLAP warehouses.


Preferred

  • Expert knowledge of data warehousing: 3NF & dimensional modeling (fact table types, SCDs), change data capture, incremental loads, data lineage, source-to-target mappings, pattern-based & parameter-driven development.
  • Expert-level experience with Microsoft SQL Server technologies: T-SQL, indexing, stored procedures, UDFs, sequences, dynamic SQL, Linked Servers, SSIS, Visual Studio, SSDT, and SQL Agent.
  • Experience with DevOps/SDLC best practices; Agile (Scrum, Kanban) with JIRA and Confluence; version control with git.
  • Strong communication and customer service skills for working with researchers, clinicians, administrators, and IT staff.
  • Excellent critical thinking, problem-solving, multitasking, and collaboration skills; ability to work independently in a fast-paced environment.
  • Preferred experience with healthcare data (EHR, billing/claims, cost accounting), Epic Clarity/Caboodle, data models (OMOP, i2b2, PCORnet).
  • Preferred experience with Azure Synapse, Azure Data Factory, Oracle PL/SQL, PostgreSQL PL/pgSQL.
  • Experience with SQL Server administration: configuration, performance tuning, partitioning, materialized views, permissions, backups & restorations.
  • Preferred experience with scripting in Windows & Linux (PowerShell, Python, or similar); HL7; web services/REST APIs; reporting tools like SSRS, Power BI, Tableau.


Strength through Unity and Inclusion


The Mount Sinai Health System is committed to fostering an environment where everyone can contribute to excellence. We share a common dedication to delivering outstanding patient care. When you join us, you become part of Mount Sinai’s unparalleled legacy of achievement, education, and innovation as we work together to transform healthcare. We encourage all team members to actively participate in creating a culture that ensures fair access to opportunities, promotes inclusive practices, and supports the success of every individual.


At Mount Sinai, our leaders are committed to fostering a workplace where all employees feel valued, respected, and empowered to grow. We strive to create an environment where collaboration, fairness, and continuous learning drive positive change, improving the well-being of our staff, patients, and organization. Our leaders are expected to challenge outdated practices, promote a culture of respect, and work toward meaningful improvements that enhance patient care and workplace experiences. We are dedicated to building a supportive and welcoming environment where everyone has the opportunity to thrive and advance professionally. Explore this opportunity and be part of the next chapter in our history.


About the Mount Sinai Health System:


Mount Sinai Health System is one of the largest academic medical systems in the New York metro area, with more than 48,000 employees working across eight hospitals, more than 400 outpatient practices, more than 300 labs, a school of nursing, and a leading school of medicine and graduate education. Mount Sinai advances health for all people, everywhere, by taking on the most complex health care challenges of our time — discovering and applying new scientific learning and knowledge; developing safer, more effective treatments; educating the next generation of medical leaders and innovators; and supporting local communities by delivering high-quality care to all who need it. Through the integration of its hospitals, labs, and schools, Mount Sinai offers comprehensive health care solutions from birth through geriatrics, leveraging innovative approaches such as artificial intelligence and informatics while keeping patients’ medical and emotional needs at the center of all treatment. The Health System includes more than 9,000 primary and specialty care physicians; 13 joint-venture outpatient surgery centers throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and more than 30 affiliated community health centers. We are consistently ranked by U.S. News & World Report's Best Hospitals, receiving high "Honor Roll" status, and are highly ranked: No. 1 in Geriatrics, top 5 in Cardiology/Heart Surgery, and top 20 in Diabetes/Endocrinology, Gastroenterology/GI Surgery, Neurology/Neurosurgery, Orthopedics, Pulmonology/Lung Surgery, Rehabilitation, and Urology. New York Eye and Ear Infirmary of Mount Sinai is ranked No. 12 in Ophthalmology. U.S. News & World Report’s “Best Children’s Hospitals” ranks Mount Sinai Kravis Children's Hospital among the country’s best in several pediatric specialties. The Icahn School of Medicine at Mount Sinai is ranked No. 11 nationwide in National Institutes of Health funding and in the 99th percentile in research dollars per investigator according to the Association of American Medical Colleges. Newsweek’s “The World’s Best Smart Hospitals” ranks The Mount Sinai Hospital as No. 1 in New York and in the top five globally, and Mount Sinai Morningside in the top 20 globally.


Equal Opportunity Employer


The Mount Sinai Health System is an equal opportunity employer, complying with all applicable federal civil rights laws. We do not discriminate, exclude, or treat individuals differently based on race, color, national origin, age, religion, disability, sex, sexual orientation, gender, veteran status, or any other characteristic protected by law. We are deeply committed to fostering an environment where all faculty, staff, students, trainees, patients, visitors, and the communities we serve feel respected and supported. Our goal is to create a healthcare and learning institution that actively works to remove barriers, address challenges, and promote fairness in all aspects of our organization.


Compensation


The Mount Sinai Health System (MSHS) provides salary ranges that comply with the New York City Law on Salary Transparency in Job Advertisements. The salary range for the role is $145,200 – $217,875 annually. Actual salaries depend on a variety of factors, including experience, education, and operational need. The salary range or contractual rate listed does not include bonuses/incentives, differential pay, or other forms of compensation or benefits.

Data Analyst, Strategic Insights & Visualization
✦ New
Salary not disclosed
Dallas, TX 1 day ago

About Us:

Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.


Security Advisory: Beware of Frauds

Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.


As the Data Analyst, Strategic Insights & Visualization, you will play a dual role within our data organization: you will be the primary storyteller for our business performance and a hands-on technical practitioner responsible for the integrity of our reporting suite. You won’t just build dashboards; you will define the metrics that drive our strategy. Whether it’s untangling a complex business logic request, auditing data quality in our Power BI apps, or partnering with business leaders to prioritize their roadmaps, you will lead with data. If you are a proactive problem-solver who loves to turn raw numbers into actionable business narratives, this is the role for you.


Responsibilities

Technical Execution

  • Reporting & Visualization: Act as the primary resource for building and maintaining Power BI reports and dashboards. Personally manage the end-to-end deployment of Power BI Apps, ensuring high performance and intuitive user experiences.
  • Metric Logic: Write and optimize the SQL and DAX required for complex business logic. Work with the data engineering team to pull the necessary data across source systems. Take ownership of metric definitions to ensure consistency across all departments, from ERP inventory tracking to Ecommerce sales performance.
  • Support & Triage: Manage the support queue for reporting incidents. Investigate data discrepancies, perform root cause analysis on quality issues, and ensure that our "source of truth" remains accurate and trusted by the organization.
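The metric-ownership idea in the bullets above (a single SQL definition of each metric, reused by every report so departments never disagree) can be sketched in miniature. This is a hedged illustration using SQLite; the table, columns, and the `net_sales` metric are invented for the example, not an actual schema.

```python
import sqlite3

# Hypothetical illustration: one canonical metric definition ("net_sales" =
# gross minus returns) expressed once in SQL, so every report that consumes
# it stays consistent. Table and column names are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, gross REAL, returns REAL, dept TEXT);
    INSERT INTO orders VALUES
        (1, 100.0, 10.0, 'ecommerce'),
        (2, 250.0,  0.0, 'ecommerce'),
        (3,  80.0,  5.0, 'retail');
""")

# The single source-of-truth definition every dashboard would reuse.
CANONICAL_NET_SALES = """
    SELECT dept, SUM(gross - returns) AS net_sales
    FROM orders
    GROUP BY dept
    ORDER BY dept
"""

rows = dict(conn.execute(CANONICAL_NET_SALES).fetchall())
print(rows)  # {'ecommerce': 340.0, 'retail': 75.0}
```

In practice the same definition would live in one versioned view or semantic-model measure, so the Power BI DAX and any ad-hoc SQL agree by construction.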

Leadership & Operations

  • Data Governance: Lead the development and maintenance of the enterprise data dictionary and business glossary. Ensure that all technical terms are translated into clear business language for non-technical stakeholders.
  • Quality Control: Define and implement data quality rules and readiness scoring. Monitor data freshness and completeness, proactively alerting the engineering team when pipelines impact reporting SLAs.
  • Security & Access: Help define access control and data security within the reporting environment, ensuring that users have the appropriate permissions and that sensitive data is protected according to company standards.

Stakeholder Collaboration

  • Business Liaison: Act as the primary bridge between the data team and business leaders. Translate vague requests ("we need better inventory insights") into clear technical requirements and prioritized project milestones.
  • Domain Prioritization: Participate in quarterly planning to sequence requests for Ecommerce, ERP, and Operations. Collaborate with the AI/ML team to prioritize use cases and define KPIs for advanced analytics initiatives.
  • Self-Service Enablement: Conduct work sessions with business users to promote BI tool adoption and empower departments to perform their own ad-hoc analysis.



Experience, Skills, & Ability Requirements

  • Bachelor’s degree in Business Analytics, Statistics, Information Systems, or equivalent professional experience
  • 3+ years of hands-on experience in a Data Analyst or Business Intelligence role, preferably supporting Ecommerce or Retail operations.
  • Proven track record of translating complex business requirements into robust, automated analytic reports and dashboards.
  • Strong SQL skills and the ability to write complex queries to extract and transform data
  • Proven proficiency in Power BI and DAX; experience managing Power BI service, workspaces, and app deployments.
  • Strong understanding of data modeling concepts, specifically Star Schema and dimensional design.
  • Experience with Microsoft Fabric or the Azure data stack.
  • Proactive attitude toward data quality and a "details-matter" mindset when auditing reports.
  • Excellent communication skills with the ability to explain complex data trends to executive stakeholders.
  • Familiarity with Tableau.
  • Microsoft Certified: Power BI Data Analyst Associate (PL-300).
  • Experience modeling datasets (such as inventory, sales, or web performance) to identify trends, correlations, and performance gaps.
  • Knowledge of basic Python for advanced forecasting or data manipulation.



What We Offer

  • Health, dental, and vision benefits
  • Paid parental leave
  • 401(k) with employer match
  • A culture of meritocracy that fosters ongoing growth opportunities
  • A stable, growing family-owned company that looks after its employees



Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.

Product Data Analyst
✦ New
🏢 Loloi Rugs
Salary not disclosed
Dallas, TX 1 day ago

Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.


Security Advisory: Beware of Frauds

Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.


We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.


Responsibilities

  • Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
  • Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
  • Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
  • API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
  • Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
  • Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
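The "orphan record" audits described above usually reduce to an anti-join between systems. Here is a minimal sketch using SQLite; the tables (`pim_products`, `erp_items`) and SKUs are assumptions invented for the example.

```python
import sqlite3

# Hypothetical sketch of an "orphan record" audit: products present in the
# PIM but missing from the ERP item master. All table and column names are
# invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pim_products (sku TEXT PRIMARY KEY, title TEXT);
    CREATE TABLE erp_items    (sku TEXT PRIMARY KEY, cost REAL);
    INSERT INTO pim_products VALUES
        ('RUG-001', 'Loren Rug'), ('RUG-002', 'Skye Rug'), ('PIL-009', 'Pillow');
    INSERT INTO erp_items VALUES ('RUG-001', 45.0), ('RUG-002', 60.0);
""")

# LEFT JOIN + IS NULL = rows in PIM with no ERP counterpart.
orphans = [row[0] for row in conn.execute("""
    SELECT p.sku
    FROM pim_products AS p
    LEFT JOIN erp_items AS e ON e.sku = p.sku
    WHERE e.sku IS NULL
    ORDER BY p.sku
""")]
print(orphans)  # ['PIL-009']
```

The same query across real schemas would feed the hand-off to IT, who would then add the constraint or script that stops the pattern recurring.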


What You Will NOT Do (The Boundaries)

  • No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
  • No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
  • No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.


Intersection with Technical Teams

  • With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
  • With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.


Experience, Skills, & Ability Requirements

  • 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
  • SQL Proficiency: You are comfortable writing queries beyond simple SELECT *. You should be proficient with CTEs (Common Table Expressions), Window Functions (e.g., Rank, Lead/Lag), Subqueries, and complex Joins to act as a forensic data investigator.
  • API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
  • Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
  • You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
  • You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
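The API-fluency requirement above (spotting a bad payload before blaming a code bug) can be sketched as a small validator. The required keys and the sample payload are assumptions made up for this illustration, not an actual product feed contract.

```python
import json

# Hypothetical payload sanity check: confirm a product JSON carries the
# storefront-critical fields with sane types before escalating to Engineering.
# The required-key contract below is invented for this sketch.
REQUIRED = {"sku": str, "price": (int, float), "weight_lbs": (int, float), "image_url": str}

def payload_errors(raw):
    """Return human-readable problems found in one product payload."""
    try:
        doc = json.loads(raw)
    except json.JSONDecodeError as exc:
        return ["not valid JSON: " + exc.msg]
    errors = []
    for key, types in REQUIRED.items():
        if key not in doc:
            errors.append("missing key: " + key)
        elif not isinstance(doc[key], types):
            errors.append("wrong type for {}: {}".format(key, type(doc[key]).__name__))
    return errors

sample = '{"sku": "RUG-001", "price": "199.00", "image_url": "https://example.com/r.jpg"}'
print(payload_errors(sample))  # ['wrong type for price: str', 'missing key: weight_lbs']
```

An empty result points to a code or caching bug on the Engineering side; a non-empty one is a data error owned by this role.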


Bonus Points (Nice-to-Haves)

  • Familiarity with Visio/Lucidchart to visualize data flows.
  • Ability to build simple dashboards in Tableau to track data health scores.
  • Basic familiarity with Python or R for data manipulation.


What We Offer

  • Health, dental, and vision benefits
  • Paid parental leave
  • 401(k) with employer match
  • A culture of meritocracy that fosters ongoing growth opportunities
  • A stable, growing family-owned company that looks after its employees


Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.

Data QA Engineer
✦ New
Salary not disclosed
Dallas, TX 1 day ago

Title: Data QA Engineer

Location: Minneapolis, Dallas, Atlanta (Onsite)

Job Type: Contract

Experience: 8–15 Years


Key Responsibilities:

  • Design, build, and maintain automated data quality frameworks to validate accuracy, completeness, consistency, and timeliness of data.
  • Develop automation scripts using Python/SQL to test data pipelines, ETL/ELT processes, and analytics workflows.
  • Implement data quality checks and monitoring within Azure-based data platforms.
  • Work extensively with Azure services (ADF, ADLS, Synapse) and Databricks for large-scale data processing.
  • Integrate data quality validations into CI/CD pipelines and support proactive issue detection.
  • Perform root cause analysis for data issues and collaborate with data engineering, analytics, and business teams to resolve them.
  • Define and enforce data quality standards, metrics, and SLAs.
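The automated checks listed above (completeness, validity, timeliness against an SLA) might look, in miniature, like the sketch below. The records, field names, and thresholds are invented for illustration; a real framework would run such rules inside the Azure/Databricks pipelines rather than in-process.

```python
from datetime import datetime, timedelta, timezone

# Minimal sketch of rule-based data quality checks of the kind a larger
# framework would automate. Records and thresholds are invented.
records = [
    {"id": 1, "amount": 120.0, "loaded_at": datetime.now(timezone.utc)},
    {"id": 2, "amount": None,  "loaded_at": datetime.now(timezone.utc)},
    {"id": 3, "amount": -5.0,  "loaded_at": datetime.now(timezone.utc) - timedelta(days=3)},
]

def run_checks(rows, freshness_sla=timedelta(days=1)):
    """Return (record id, reason) pairs for every rule violation found."""
    now = datetime.now(timezone.utc)
    failures = []
    for r in rows:
        if r["amount"] is None:
            failures.append((r["id"], "completeness: amount is null"))
        elif r["amount"] < 0:
            failures.append((r["id"], "validity: amount is negative"))
        if now - r["loaded_at"] > freshness_sla:
            failures.append((r["id"], "timeliness: stale beyond SLA"))
    return failures

for rec_id, reason in run_checks(records):
    print(rec_id, reason)
```

Wiring the same assertions into a CI/CD stage is what turns them into the proactive issue detection the role calls for.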

Required Skills & Qualifications:

  • Strong experience (8–15 years) in data engineering, data quality, or data automation roles.
  • Hands-on expertise with Azure data ecosystem and Databricks.
  • Strong programming skills in Python and SQL.
  • Experience building automated data validation and reconciliation frameworks.
  • Solid understanding of data warehousing, data lakes, and distributed data processing.
  • Familiarity with DevOps/CI-CD practices for data platforms.

Preferred Skills:

  • Experience with data observability or data quality tools.
  • Exposure to cloud-scale analytics and performance optimization.
  • Strong communication and stakeholder management skills.
Sr Data Analyst
✦ New
Salary not disclosed
Dallas, TX 11 hours ago

Title: Senior Data Analyst

Duration: Long term

Location: Dallas, TX



Job Description:

Primary responsibilities of the Senior Data Analyst include supporting and analyzing data anomalies across multiple environments, including but not limited to Data Warehouse, ODS, and Data Replication/ETL Data Management initiatives. The candidate will be in a supporting role and will work closely with the Business, DBA, ETL, and Data Management teams, providing analysis and support for complex data-related initiatives. This individual will also be responsible for assisting in the initial setup and ongoing documentation/configuration related to Data Governance and Master Data Management solutions. This candidate must have a passion for data, along with good SQL, analytical, and communication skills.

Responsibilities

  • Investigate and Analyze data anomalies and data issues reported by Business
  • Work with ETL, Replication and DBA teams to determine data transformations, data movement and derivations and document accordingly
  • Work with support teams to ensure consistent and pro-active support methodologies are adhered to for all aspects of data movements and data transformations
  • Assist in break fix and production validation as it relates to data derivations, replication and structures
  • Assist in configuration and on-going setup of Data Virtualization and Master Data Management tools
  • Assist in keeping documentation up to date as it relates to Data Standardization definitions, Data Dictionary and Data Lineage
  • Gather information from various sources and interpret patterns and trends
  • Ability to work in a team-oriented, fast-paced agile environment, managing multiple priorities


Qualifications

  • 4+ years of SQL experience working in OLTP, Data Warehouse and Big Data databases
  • 4+ years of experience working with Exadata and SQL Server databases
  • 4+ years in a Data Analyst role
  • Strong attention to detail
  • 2+ years writing medium to complex stored procedures a plus
  • Ability to collaborate effectively and work as part of a team
  • Extensive background in writing complex queries
  • Extensive working knowledge of all aspects of Data Movement and Processing, including ETL, API, OLAP and best practices for data tracking
  • Good communication skills
  • Self-motivated
  • Works well in a team environment
  • Denodo Experience a plus
  • Master Data Management a plus
  • Big Data Experience a plus (Hadoop, MongoDB)
  • Postgres and Cloud Experience a plus