Data Migration Jobs in the USA

11,520 positions found — Page 3

Data Operations Lead
✦ New
Salary not disclosed
Minneapolis, MN 1 day ago

About US Solar

US Solar is a developer, owner, operator, and financier of solar and solar + storage projects, with a focus on emerging state markets, community solar programs, distributed generation and small-scale utility projects nationwide.


US Solar is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We believe diverse teams and diverse perspectives lead to better outcomes and breakthrough thinking, which are differentiators in any business and fundamental to our long-term success.


About Sunscription

Sunscription is US Solar’s platform for managing community solar subscriptions, billing, and customer operations across multiple markets. The platform supports both residential and commercial subscribers, enabling them to participate in community solar projects and receive savings on their electric bills.


The Subscription Data Operations Lead will join the Sunscription team and play a critical role in supporting contract execution, allocation accuracy, and financial closings by serving as the central owner of subscription data and documentation.


Position Description

The Subscription Data Operations Lead serves as the primary data input and coordination point for community solar subscriptions. This role owns the accuracy and flow of information across allocation spreadsheets, executed contracts, utility documentation, and internal systems.

The position requires strong execution within US Solar’s current Excel-based allocation and mail merge workflows, while also supporting improvements to automation, documentation, and reporting processes over time. The successful candidate will be detail-oriented, systems-minded, and comfortable operating in a fast-paced environment where processes continue to evolve.


Responsibilities

  • Serve as the primary owner of subscription data across allocation spreadsheets, contracts, utility documentation, and internal platforms.
  • Execute and maintain Excel-based allocation models and mail merge workflows used to generate contracts and supporting documentation.
  • Ensure consistency and accuracy between modeled allocations, executed agreements, and utility records through regular validation and reconciliation.
  • Administer the execution and recording of commercial subscription agreements and associated costs to support long term contract management, cost, and revenue tracking.
  • Track and analyze residential subscriber acquisition activity to monitor program progress, validate enrollment data, and support allocation planning.
  • Organize and maintain allocation lists, contracts, utility bills, and utility documentation required for enrollment, billing, and ongoing management.
  • Create and maintain subscription summaries and documentation required for program and project financial closings.
  • Track additional documentation requirements as projects move toward COD and financial close.
  • Migrate deal information and documentation accurately and completely into the internal subscriber billing and management platform.
  • Standardize documentation and reporting formats to improve consistency and accessibility for internal stakeholders.
  • Identify opportunities to streamline manual processes and improve efficiency within existing Excel and document generation workflows.
  • Collaborate with accounting, finance, asset management, and the Sunscription team to support data needs across the customer lifecycle.
  • Create and deliver customer onboarding communication to support billing setup and closing requirements.
  • Perform process improvement and administrative tasks to support the overall success of community solar subscriptions.
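The mail merge workflow described above boils down to rendering one document per row of structured allocation data. A minimal sketch of that step, using only the Python standard library; the column names and contract template are illustrative assumptions, not US Solar's actual schema:

```python
import csv
import io
from string import Template

# Hypothetical contract template; field names are illustrative only.
CONTRACT_TEMPLATE = Template(
    "Subscription Agreement\n"
    "Subscriber: $subscriber_name\n"
    "Project: $project\n"
    "Allocation: $allocation_kw kW\n"
)

def generate_contracts(allocation_csv: str) -> list[str]:
    """Render one contract per row of an allocation spreadsheet export."""
    rows = csv.DictReader(io.StringIO(allocation_csv))
    return [CONTRACT_TEMPLATE.substitute(row) for row in rows]

# Sample allocation export (in practice this would come from the Excel model).
sample = (
    "subscriber_name,project,allocation_kw\n"
    "Acme Bakery,MN-Solar-01,25\n"
    "Jane Doe,MN-Solar-01,8\n"
)
contracts = generate_contracts(sample)
```

The same row-keyed structure supports the reconciliation responsibility: because every generated document traces back to a spreadsheet row, modeled allocations and executed agreements can be diffed field by field.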


Requirements

  • Bachelor’s degree and five or more years of professional experience in operations, data management, finance, or a related field.
  • Exceptional attention to detail with strong organizational skills.
  • Advanced proficiency in Microsoft Excel and experience managing complex spreadsheets.
  • Experience executing document generation or mail merge workflows tied to structured data.
  • Comfort working with contracts, utility documentation, and operational data.
  • Ability to learn new tools and contribute to the gradual improvement of existing systems and processes.
  • Strong communication skills and ability to collaborate across teams.
  • Self-directed and comfortable working independently in a fast-paced environment.
  • Interest in renewable energy and community solar programs.
  • US Solar seeks individuals who are flexible, motivated, responsible, and eager to contribute to a collaborative team environment.
Not Specified
Senior Data Engineer
Salary not disclosed
Fremont, CA 3 days ago

The Company

A rapidly growing data consultancy founded in 2023 by the former data/technology leadership team of a venture-backed biotech in San Francisco. The firm has already delivered 20+ engagements across tech, healthcare/biotech, finance, energy, real estate, and startups, building complex data platforms, products, and AI-driven systems.


The Role

A hands-on, senior individual contributor role for engineers who still love coding. You’ll work in small teams (often 1–3 engineers) to design and build production-grade data platforms, pipelines, and products across industries.


What You’ll Work On

  • High-impact, fixed-scope builds (e.g., enterprise data marts, complex migrations)
  • End-to-end data platform deployments (ETL, warehouses, BI across AWS/Azure/GCP)
  • Partnering with startups to build data-intensive products from 0 → 1


What We’re Looking For

Hands-on builder

  • Actively writing production code today
  • Not moved into management or purely architectural roles

Infrastructure ownership

  • Personally deployed and operated production systems
  • Cloud, CI/CD, scaling, monitoring, reliability

End-to-end ownership

  • Taken products from idea → launch → ongoing operation
  • Comfortable operating autonomously with stakeholders

True seniority (well beyond 5 years)

  • Targeting engineers with meaningful depth and ownership
  • Strong preference for backgrounds in smaller, high-ownership environments
  • Experience wearing multiple hats (application + infrastructure + deployment)


Why Join

  • High autonomy and real technical ownership
  • Variety of industries and problems
  • Small, elite engineering team
  • Opportunity to shape a fast-scaling consultancy


Location: San Francisco (5 days a week on-site)

Salary: $190k-$250k + 10-20% bonus + equity + sign on bonus

Benefits: Full Health, Vision, Dental, Life Insurance, Commuter Benefits, Unlimited Time Off, 401(k) match.

Not Specified
Adobe CJA Data Analytics
✦ New
🏢 Dexian
Salary not disclosed
Eden Prairie, MN 1 day ago

Data Analyst (Adobe Data Analytics / CJA Consultant)

Remote

C2H (contract-to-hire) role



Education / Certification

Required: Bachelor’s Degree



Experience

  • 4–6 years of related experience required
  • Strong proficiency in JavaScript with hands-on implementation and troubleshooting
  • Experience with Adobe Analytics implementation and/or Adobe CJA migration preferred
  • Experience with tag management systems, preferably Adobe Launch
  • Expertise in data layer design and implementation
  • Experience tagging both native mobile applications and websites
  • Experience working with content management systems such as AEM and frameworks like React and React Native
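"Data layer design" in the list above means agreeing on a structured object that tags and the tag manager read from the page. A minimal sketch of validating such an object, in Python for self-containment; the key names are hypothetical, not Adobe's actual data layer schema:

```python
# Hypothetical data-layer shape for a page-view event; key names are
# illustrative assumptions, not Adobe's schema.
REQUIRED_KEYS = {
    "event": str,
    "page": dict,   # e.g. {"name": ..., "section": ...}
    "user": dict,   # e.g. {"loginState": ...}
}

def validate_data_layer(payload: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload passes."""
    problems = []
    for key, expected_type in REQUIRED_KEYS.items():
        if key not in payload:
            problems.append(f"missing key: {key}")
        elif not isinstance(payload[key], expected_type):
            problems.append(f"{key} should be {expected_type.__name__}")
    return problems

good = {"event": "pageView", "page": {"name": "home"}, "user": {"loginState": "anon"}}
bad = {"event": "pageView", "page": "home"}  # wrong type, missing "user"
```

In practice this kind of check runs in the browser (JavaScript) or in tag-management QA tooling, but the shape-validation idea is the same.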


Disqualifier

  • No experience working with cross-functional teams

Additional Qualities

  • SQL skills
  • Healthcare industry experience


Top 3 Must-Have Hard Skills (Ranked)

  1. Adobe Launch and JavaScript – hands-on implementation and troubleshooting
  2. Mobile application analytics implementation experience
  3. Experience with different content management systems

Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals.

Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development and Dexian IT Solutions. Visit to learn more.

Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.

Not Specified
Junior Data Analyst / Machine Learning Engineer
✦ New
Salary not disclosed
Oakland 1 day ago
CS/IT Graduates or Soon-to-Be Grads.

Get Hired by taking action.

If you just graduated (or you're about to) and the job search is already feeling confusing, you're not imagining it.

A degree proves you can learn—but employers hire for job readiness: projects that look like real work, current tech stacks, interview confidence, and the ability to contribute on day one.

That's why many new grads send hundreds of applications and still hear nothing back.

It's not because you're “not smart enough.” It's because most entry-level pipelines are crowded, and hiring teams filter heavily for candidates who look production-ready.

We are actively considering candidates for entry-level software engineering and data roles, especially Java full stack, Java/Python development, DevOps automation, data analytics, data engineering, data science, and ML/AI—full-time opportunities aligned to client needs.

Our core emphasis remains Java/Full Stack/DevOps and Data/Analytics/Engineering/ML.

SynergisticIT focuses on two high-demand lanes: Java / Full Stack / DevOps and Data (Data Analyst, Data Engineer, Data Scientist) + ML/AI—so you don't graduate with scattered skills, you graduate with an employable stack.

Since 2010, SynergisticIT has helped candidates land full-time roles at major organizations (examples often cited include Google, Apple, PayPal, Visa, Western Union, Wells Fargo, Wayfair, and more) with offers commonly in the $95k–$154k range depending on role and skill depth.

For a new grad, the bigger message isn't the number—it's that results require a structured pathway, not random applications.

Here's a realistic way to think about your advantage as a fresh graduate: you're early enough to build the right foundation before bad habits set in.

If you master fundamentals—coding, debugging, data structures, system thinking—and then layer modern tools on top (frameworks, cloud, CI/CD, analytics stacks), you become the kind of “entry-level” candidate who actually feels like a safe hire.

What roles are companies hiring for right now? A typical market demand pattern is clear: organizations still need entry-level software programmers, Java full stack developers, Python/Java developers, DevOps-focused engineers, and on the data side data analysts, BI analysts, data engineers, data scientists, and machine learning engineers.

The strongest candidates aren't “tool collectors”—they're people who can show end-to-end capability: build an API, connect a database, deploy a service, analyze data, explain results, and handle interviews calmly.

Why fresh grads get stuck: Fresh grads often struggle for four predictable reasons:

  • Resume doesn't match job keywords (ATS filters you out).
  • Projects look like school assignments (not production-aligned).
  • Interview skills are undertrained (DSA, system design, SQL, behavioral).
  • No structured pipeline (random applying without feedback loops).

A job-placement-first approach addresses these systematically: build the right portfolio, practice the right interview questions, align your tech stack to roles, and keep improving until the market says “yes.”

Who this path fits best: if you're a recent graduate, you'll likely fit if you match any of these:

  • New grads in CS, Engineering, Math, or Statistics with limited job experience
  • Students finishing Bachelor's or Master's programs who need a real hiring plan
  • Candidates who apply consistently but don't get callbacks
  • Candidates who reach interviews but struggle to close
  • International students on F-1/OPT who need a job plan for STEM extension/H-1B timing
  • Graduates with strong academics but thin practical experience

SynergisticIT helps with STEM extension and work authorization pathways and, for candidates who need long-term stability, with support related to H-1B and green card processes as part of employer-side realities.

If you're tired of guessing, stop treating your job search like a lottery.

Treat it like a project with milestones: skills → portfolio → interview readiness → targeted applications → scheduled interviews → offer.

If you want to explore, here are the key links: event videos (OCW, JavaOne, Gartner), the USA Today feature, and our contact page for a roadmap. Please read our blogs:

  • Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
  • What Recruiters Look for in Junior Developers | SynergisticIT
  • Software engineering or Data Science as a career?
  • How OPT Students Can Land Tech Jobs – SynergisticIT

Bottom line for fresh grads: Your degree is the starting line, not the finish line.

If you want to get hired faster, you don't need “more random courses.” You need a guided, job-focused path and the right people around you.

In tech, it's not just what you learn—it's how you learn and who you build with that decides how far you go.

Please note: Resume databases are shared with clients and interested clients will reach out directly if they find a qualified candidate for their req.

Resume submissions may be shared with our JOPP team database also.

Please unsubscribe if contacted; if you don't want to be contacted, please don't submit your resume.
Not Specified
Lecturer - College of Computing, Data Science, and Society
Salary not disclosed
Berkeley, CA 2 days ago
Position overview

Salary range:
The UC academic salary scales set the minimum pay at appointment; see the current salary scale for this position. The current full-time salary range for this position is $70,977–$199,722. Placement on the scale is commensurate with college teaching experience.

Percent time:
15% to 100%

Anticipated start:
Positions usually start in July or August for Fall, January for Spring and June for Summer.

Review timeline:
Applications will be accepted and reviewed for unit needs through November 2027. Applications are typically considered in April and May for fall course needs, in September and October for spring course needs, and in February and March for summer course needs. The pool will close in November 2027; applicants wishing to remain in the pool after that time will need to submit a new application.

Application Window


Open date: June 10, 2025




Most recent review date: Tuesday, Jun 24, 2025 at 11:59pm (Pacific Time)

Applications received after this date will be reviewed by the search committee if the position has not yet been filled.




Final date: Wednesday, Nov 25, 2026 at 11:59pm (Pacific Time)

Applications will continue to be accepted until this date, but those received after the review date will only be considered if the position has not yet been filled.



Position description

The College of Computing, Data Science, and Society at the University of California, Berkeley invites applications for a pool of qualified temporary lecturers to teach CDSS courses should an opening arise. Screening of applicants is ongoing and will continue as needed. The number of positions varies from semester to semester (fall, spring and summer sessions), depending on the needs of the unit.



About CDSS:



Established July 1, 2023, the College of Computing, Data Science, and Society (CDSS) is the first new college at Berkeley in over 50 years. The College was created to meet the demands and opportunities at a time when data touches nearly every aspect of our lives. Innovations in computing and statistics are converging to create unprecedented opportunities to use data science, machine learning, and artificial intelligence to tackle pressing societal challenges from human health to climate change.



CDSS offers outstanding undergraduate major programs in Computer Science, Data Science, and Statistics. Over 1,500 students graduated with a degree in these majors in Spring 2024, and one in four held a second major in another discipline. CDSS undergraduates study with faculty from a wide range of fields, where they gain the knowledge, skills, and experiences needed to succeed in today's datafied world, interact with data ethically, and masterfully engage as informed leaders.



CDSS seeks candidates who can support the success of all students through inclusive curriculum, classroom environment, and pedagogy.



Responsibilities:



CDSS is seeking outstanding instructors to be appointed in the non-Senate Lecturer title series who can teach small and large courses. We are particularly interested in instructors who can teach courses that satisfy the Human and Social Dynamics of Data and Technology requirement for the college. This requirement is designed for the purpose of developing an understanding of how technology and data interact with human and societal contexts, including ethical considerations and applications such as education, health, law, natural resources, and public policy. Examples include: Anthropology of Science, Data, and Technology; Artificial Humanities: AI, Language, and Fiction; and Data and Justice.



Teaching a CDSS course may include holding office hours, assigning grades, advising students, preparing course materials (e.g., slides, syllabus, homework assignments), providing clear and prompt feedback on student work, and maintaining the course website.



Please note: The use of a lecturer pool does not guarantee that an open position exists. See the review date specified in AP Recruit to learn whether the unit is currently reviewing applications for a specific position. If there is no future review date specified, your application may not be considered at this time.



Division:



Qualifications

Basic qualifications (required at time of application)

Must have an advanced degree or be enrolled in an advanced degree program at the time of application.



Additional qualifications (required at time of start)

Advanced degree. Candidates must already be authorized to work in the United States.



Preferred qualifications

A Ph.D. or equivalent international degree that is cross-disciplinary with data science in either the social sciences, humanities, education, health, law, natural resources, public policy, computer science, statistics, or engineering, is preferred.



Ability to support the success of all students through inclusive curriculum, classroom environment, and pedagogy.



Application Requirements

Document requirements

  • Curriculum Vitae - Your most recently updated C.V.


  • Cover Letter


  • Statement of Teaching - Please discuss prior teaching experience, teaching approach, and future teaching interests. This can include, for example, specific efforts, accomplishments, and future plans to support the success of all students through inclusive curriculum, classroom environment, and pedagogy.




Reference requirements
  • 3-4 required (contact information only)


Apply link:
JPF04959

Help contact:



About UC Berkeley

UC Berkeley is committed to diversity, equity, inclusion, and belonging in our public mission of research, teaching, and service, consistent with UC Regents Policy 4400 and University of California Academic Personnel policy (APM 210 1-d). These values are embedded in our Principles of Community, which reflect our passion for critical inquiry, debate, discovery and innovation, and our deep commitment to contributing to a better world. Every member of the UC Berkeley community has a role in sustaining a safe, caring and humane environment in which these values can thrive.



The University of California, Berkeley is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, or protected veteran status.



For more information, please refer to the University of California's Affirmative Action and Nondiscrimination in Employment Policy and the University of California's Anti-Discrimination Policy.



In searches when letters of reference are required, all letters will be treated as confidential per University of California policy and California state law. Please refer potential referees, including when letters are provided via a third party (e.g., dossier service or career center), to the UC Berkeley statement of confidentiality prior to submitting their letter.



As a University employee, you will be required to comply with all applicable University policies and/or collective bargaining agreements, as may be amended from time to time. Federal, state, or local government directives may impose additional requirements.


Unless stated otherwise, unambiguously, in the position description, this position does not include sponsorship of a new consular H-1B visa petition that would require payment of the $100,000 supplemental fee.



As a condition of employment, the finalist will be required to disclose if they are subject to any final administrative or judicial decisions within the last seven years determining that they committed any misconduct.




  • "Misconduct" means any violation of the policies or laws governing conduct at the applicant's previous place of employment, including, but not limited to, violations of policies or laws prohibiting sexual harassment, sexual assault, or other forms of harassment or discrimination, as defined by the employer.
  • UC Sexual Violence and Sexual Harassment Policy
  • UC Anti-Discrimination Policy
  • APM - 035: Affirmative Action and Nondiscrimination in Employment


Job location
Berkeley, CA
Not Specified
Informatica Data Analyst
🏢 nLeague
Salary not disclosed
Denver 3 days ago
Web Accessibility Developer
Job Id: Digital Accessibility - Website Developer
Client: CT DAS
Duration: 6 Months
Location: Remote (Hartford, CT)

Job Description: The State of Connecticut (CT) is seeking a Digital Accessibility Web Developer with deep experience in remediating accessibility issues across a wide range of platforms and technologies.

You will partner closely with our accessibility testers and analysts to turn accessibility audit findings into fully remediated digital experiences that meet or exceed compliance standards.

The ideal candidate will have expert-level experience remediating accessibility barriers in CMS systems such as Sitecore, Salesforce, and custom web applications (HTML/ARIA/CSS/JavaScript), as well as working knowledge of AWS services, Biznet platforms, and enterprise databases.

You will be hands-on in HTML and accessibility markup remediation, working primarily within the State's CMS platforms and custom HTML environments.

You'll partner with digital accessibility testers to review audit findings and make front end code corrections to ensure WCAG 2.1 AA compliance.

Remediation Focus Areas

  • Apply accessibility fixes to front-end code and markup issues identified through audits (e.g., color corrections, alt text, heading structure, keyboard navigation, link roles, ARIA roles)
  • Modify and restructure HTML, CSS, and ARIA to comply with WCAG 2.1 AA standards
  • Work within CMS platforms like Sitecore, Salesforce, and WordPress to correct issues in templates, content types, and presentation layers
  • Support content and design teams with accessibility guidance for remediating documents, forms, and embedded media
  • Use defect tracking tools (JIRA) to manage tickets and document fixes
  • Collaborate with accessibility testers and content strategists to validate remediated work and prevent recurrence of issues
  • Share knowledge and remediation patterns with other developers to promote consistency and sustainability

Required Knowledge, Skills, and Abilities

  • Bachelor's degree in Computer Science, Software Engineering, IT, or a related field
  • 4 years of experience remediating digital accessibility issues in websites, apps, and platforms
  • Strong coding experience in HTML, CSS, JavaScript, and ARIA markup
  • Working knowledge of Sitecore and Salesforce platforms, with demonstrated remediation success
  • Familiarity with Biznet applications, AWS infrastructure, or common enterprise back-end platforms
  • Ability to interpret automated and manual testing results (e.g., Axe, ANDI, NVDA, JAWS) and apply solutions
  • Expert knowledge of WCAG 2.1 AA standards and assistive technology interactions
  • Proficiency in CMS templates, JavaScript frameworks, backend API configuration, and UI component libraries
  • Experience troubleshooting keyboard traps, focus management, form label/field logic, and responsive layouts
  • Strong ability to work in agile sprints, manage remediation tickets, and track progress in Jira or similar tools
  • Ability to collaborate with QA testers, content editors, and project managers in an agile environment
  • Excellent communication and documentation skills for communicating fixes and coaching teams

Preferred Skills and Qualifications

  • Experience with Sitecore MVC or SXA customization
  • Front-end developer or CMS certifications
  • Experience with accessibility remediation tools
  • Experience with customized CMS themes, templates, and components
  • Strong attention to content structure (heading levels, alt text, semantic HTML)
  • Experience remediating PDF, Word, or PowerPoint documents (for secondary support)
  • Familiarity with CI/CD integration of accessibility checks (e.g., axe-core in pipelines)
  • Familiarity with design handoff tools (e.g., Figma or Adobe XD) for accessibility review

Desired Certifications (one or more of the following)

  • IAAP WAS (Web Accessibility Specialist), strongly preferred
  • IAAP CPACC
  • DHS Trusted Tester Certification
  • Deque University Developer Track Certificate
  • Salesforce Accessibility Champion or similar

Additional Requirements

  • Prior PowerCenter → IDMC migration experience
  • Experience or familiarity with Linux system administration activities
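Audit findings like "missing alt text" typically come out of automated scans before a developer remediates them. A minimal sketch of such a check using only the Python standard library's html.parser; real audits would use tools like Axe or ANDI named above, so this is illustrative only:

```python
from html.parser import HTMLParser

class MissingAltAuditor(HTMLParser):
    """Flag <img> tags that have no alt attribute at all.

    WCAG 1.1.1 requires a text alternative; purely decorative images
    should carry an explicit empty alt="" rather than omit it.
    """
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing_alt.append(attr_map.get("src", "<no src>"))

def audit(html: str) -> list[str]:
    """Return the src of every <img> lacking an alt attribute."""
    auditor = MissingAltAuditor()
    auditor.feed(html)
    return auditor.missing_alt

page = '<img src="logo.png" alt="Company logo"><img src="hero.jpg">'
```

Remediation is then the inverse step: add a meaningful alt (or alt="" for decorative images) wherever the audit flags a gap.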
Not Specified
Databricks Architect/ Senior Data Engineer
✦ New
🏢 OZ
Salary not disclosed
Boca Raton, FL 1 day ago

OZ – Databricks Architect/ Senior Data Engineer


Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.


We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!


What We're Looking For:

We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.


This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.


Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.


Position Overview:

The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.


This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.


Key Responsibilities:

  • Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
  • Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
  • DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
  • Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
  • Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
  • GenAI Application Development: Experience developing GenAI applications is a strong plus.
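The Medallion Architecture named in the position overview layers data as bronze (raw), silver (cleansed), and gold (business-level aggregates). In Databricks these would be PySpark transformations over Delta tables; the sketch below uses plain Python lists of dicts purely to keep the example self-contained, with hypothetical field names:

```python
# Conceptual sketch of the Medallion Architecture (bronze -> silver -> gold).
# Field names and sample data are illustrative assumptions.

bronze = [  # raw ingested records, as-landed (duplicates and bad values included)
    {"order_id": "1", "amount": "10.50", "region": "east"},
    {"order_id": "1", "amount": "10.50", "region": "east"},   # duplicate
    {"order_id": "2", "amount": "bad",   "region": "west"},   # unparseable
    {"order_id": "3", "amount": "7.25",  "region": "east"},
]

def to_silver(rows):
    """Cleanse and conform: drop duplicates and rows that fail type checks."""
    seen, silver = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine these, not drop them
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append({**r, "amount": amount})
    return silver

def to_gold(rows):
    """Aggregate to a business-level table: revenue per region."""
    gold = {}
    for r in rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
```

Each layer stays materialized, so downstream BI and AI workloads read the gold tables while data engineers can replay silver logic against bronze when rules change.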


Requirements:

  • 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
  • Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
  • Strong programming skills in Python and SQL; experience with PySpark required.
  • Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
  • Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
  • Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
  • Strong understanding of data architecture, data modeling, and performance optimization.
  • Experience working with cross-functional teams to deliver enterprise data solutions.
  • Tackles complex data challenges, ensuring data quality and reliable delivery.


Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience designing enterprise-scale data platforms and modern data architectures.
  • Experience with data integration tools such as Azure Data Factory or similar platforms.
  • Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
  • Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
  • Databricks, Azure, or cloud certifications are preferred.
  • Strong problem-solving, communication, and technical leadership skills.


Technical Proficiency in:

  • Databricks, Apache Spark, PySpark, Delta Lake
  • Python, SQL, Scala (preferred)
  • Cloud platforms: Azure (preferred), AWS, or GCP
  • Azure Data Factory, Kafka, and modern data integration tools
  • Data warehousing: Databricks, Snowflake, or Microsoft Fabric
  • DevOps tools: Git, Azure DevOps, CI/CD pipelines
  • Data architecture, ETL/ELT design, and performance optimization


What You’re Looking For:

Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.


About Us:

OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.


OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.

Data Integration & AI Engineer
✦ New
Salary not disclosed
Edison, NJ 1 day ago

About Wakefern

Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.


Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.


The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with focus on automating data processes and driving efficiency within the organization. This role requires a close collaboration with application developers, data engineers, data analysts, data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.


Essential Functions

  • Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
  • Implement and enforce data quality and governance standards to ensure data accuracy and consistency.
  • Provide input for project plans and timelines to align with business objectives.
  • Monitor project progress, identify risks, and implement mitigation strategies.
  • Work with cross-functional teams and ensure effective communication and collaboration.
  • Provide regular updates to the management team.
  • Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
  • Communicates and promotes the code of ethics and business conduct.
  • Ensures completion of required company compliance training programs.
  • Is trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
  • Stays current through personal development and professional and industry organizations.

Responsibilities

  • Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
  • Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
  • Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
  • Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
  • Ensure data solutions and data sources meet quality, security, and compliance standards.
  • Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
  • Provide technical training, documentation, and ongoing support to end users of data automation systems.
  • Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
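The pipeline responsibilities above can be sketched as a minimal extract-transform-load flow with a built-in quality gate. This is an illustrative sketch only; the record shape and field names (`store_id`, `amount`) are hypothetical, not taken from any Wakefern system.

```python
# Minimal ETL sketch with a data-quality gate (illustrative; field names are hypothetical).

def extract(rows):
    """Pretend source extraction: return raw records as-is."""
    return list(rows)

def transform(rows):
    """Normalize records: strip whitespace, cast amounts to float."""
    return [{"store_id": r["store_id"].strip(), "amount": float(r["amount"])}
            for r in rows]

def quality_check(rows):
    """Fail fast if required fields are missing or amounts are negative."""
    for r in rows:
        if not r["store_id"]:
            raise ValueError("missing store_id")
        if r["amount"] < 0:
            raise ValueError(f"negative amount for {r['store_id']}")
    return rows

def load(rows, target):
    """Append validated rows to an in-memory 'warehouse' table."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = [{"store_id": " S01 ", "amount": "12.50"},
       {"store_id": "S02", "amount": "3"}]
loaded = load(quality_check(transform(extract(raw))), warehouse)
```

In a production setting the same shape would be expressed in an orchestration tool (e.g., Airflow or Cloud Composer, both named in the qualifications below), with each function becoming a task and the quality check gating the load step.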


Qualifications

  • A bachelor's degree or higher in computer science, information systems, or a related field.
  • Hands-on experience with cloud data platforms (e.g., GCP, Azure)
  • Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
  • Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Experience with workflow orchestration tools such as Cloud Composer or Airflow
  • Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
  • Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
  • Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
  • Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
  • Experience establishing and enforcing data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
  • Ability to collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
  • Hands-on experience with IBM DataStage and Alteryx is a plus.
  • Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
  • Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
  • Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
  • Familiarity with data modeling tools.
  • Familiarity with DevOps practices for data (CI/CD pipelines)
  • Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
  • Strong knowledge and skills in data management, data quality, and data governance.
  • Strong communication, collaboration, and problem-solving skills.
  • Ability to work on multiple projects and prioritize tasks effectively.
  • Ability to work independently and in a team environment.
  • Ability to learn new technologies and tools quickly.
  • Ability to handle stressful situations.
  • Highly developed business acumen.
  • Strong critical thinking and decision-making skills.
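The RAG pipeline qualification above boils down to two steps: index documents as vectors, then retrieve the nearest match for a query. The toy sketch below uses bag-of-words vectors and cosine similarity in pure Python; a real system would use learned embeddings and a vector database such as Pinecone or Vertex AI Vector Search (named above), but the indexing/retrieval shape is the same.

```python
# Toy retrieval step of a RAG pipeline: index short documents as bag-of-words
# vectors and return the best match for a query. Illustrative only; real
# pipelines use learned embeddings and a managed vector store.
import math
from collections import Counter

def embed(text):
    """Bag-of-words 'embedding' (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class TinyVectorIndex:
    def __init__(self):
        self.docs = []

    def add(self, text):
        self.docs.append((text, embed(text)))

    def query(self, text, k=1):
        q = embed(text)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[1]), reverse=True)
        return [doc for doc, _ in ranked[:k]]

index = TinyVectorIndex()
index.add("store returns policy for perishable goods")
index.add("supplier onboarding checklist")
best = index.query("what is the returns policy")[0]
```

The retrieved text would then be passed to a language model as grounding context, which is the "augmented generation" half of RAG.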


Working Conditions & Physical Demands

This position requires in-person office presence at least 4x a week.


Compensation and Benefits

The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.

Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.


Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.

Data Governance Manager
✦ New
Salary not disclosed
Dallas, TX 4 hours ago

Must be local to TX


Skills:


Delivery

  • Delivery manager experience; own and deliver the 2026 roadmap
  • Interact with the business to deliver the roadmap: explain the value proposition and understand their rules and standard rules
  • Manage timelines
  • Partner with segments
  • Measure before-and-after Data Quality scores


Technical

  • Articulate technical design and solutions
  • Know the capabilities of Collibra and Soda, and how to use those tools
  • Proactive communication skills


12+ years in a Technical Project Manager type of role, with solutioning and problem-solving skills

Role Summary

The Data Governance Lead will design, build, and scale an enterprise data governance program from the ground up, using Collibra as the core platform for a large real estate enterprise. This senior role combines strategic leadership, hands‑on Collibra configuration, stakeholder management, and deep domain knowledge of real estate data. The incumbent will own the governance vision, operating model, and tooling, and will partner with business, IT, data engineering, analytics, legal, and compliance teams.


Key Responsibilities


1. Data Governance Strategy and Operating Model

  • Define and implement the enterprise data governance strategy, roadmap, and operating model aligned to business objectives.
  • Define governance KPIs, maturity metrics, and success measures.
  • Drive adoption through change management, communications, and training.


2. Collibra Implementation from Scratch

  • Lead end‑to‑end Collibra implementation: platform setup, environment planning (Dev/Test/Prod), domain modeling, and taxonomy design.
  • Customize asset models for real estate use cases.
  • Configure and manage Business Glossary, Data Dictionary, Data Catalog, and Reference Data & Code Sets.
  • Design and implement Collibra workflows for glossary lifecycle, owner/steward assignment, issue management, and escalation.
  • Implement Collibra operating model with defined roles (Data Owner, Data Steward, Custodian, Consumer) and RACI mappings.
  • Integrate Collibra with data warehouses/lakes (Snowflake, BigQuery, Azure), BI tools (Power BI, Tableau), and ETL/ELT tools (Informatica, dbt, ADF).
  • Lead metadata ingestion across technical, operational, and business metadata.


3. Data Ownership, Stewardship, and Accountability

  • Define and institutionalize data ownership and stewardship across business units.
  • Partner with business leaders to assign Data Owners and Stewards.
  • Drive accountability for data definitions, data quality, and metadata completeness.
  • Establish Data Governance Councils and working groups.


4. Data Quality and Issue Management

  • Collaborate with data quality teams to define Critical Data Elements (CDEs) and align rules and thresholds.
  • Configure Collibra issue management workflows and ensure traceability from issues to root causes and remediation actions.
  • Provide governance oversight for remediation and continuous improvement.
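The CDE rules-and-thresholds idea above can be sketched generically: compute a quality metric for a critical field and compare it to a governance threshold, with failures feeding the issue-management workflow. This is not Collibra or Soda syntax; the field name and threshold are illustrative.

```python
# Generic critical-data-element (CDE) check: measure completeness of a field
# and compare it to a governance threshold. Illustrative only; not
# Collibra/Soda syntax, and the field name is hypothetical.

def completeness(rows, field):
    """Fraction of rows where `field` is present and non-empty."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def evaluate_cde(rows, field, threshold):
    """Return (score, passed); failures would open a governance issue."""
    score = completeness(rows, field)
    return score, score >= threshold

records = [{"tenant_id": "T1"}, {"tenant_id": ""}, {"tenant_id": "T3"},
           {"tenant_id": "T4"}]
score, passed = evaluate_cde(records, "tenant_id", threshold=0.9)
```

In a governed environment, the rule definition and threshold would live alongside the CDE in the catalog, and a failing score would trigger the issue workflow with traceability to the offending source.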


5. Compliance, Risk, and Security Governance

  • Define governance controls for regulatory compliance, contractual data, and financial reporting.
  • Partner with Legal, Risk, and Security to classify sensitive data and apply access and usage policies.
  • Implement data classification and privacy metadata within Collibra.


6. Stakeholder and Program Leadership

  • Serve as the single point of accountability for the data governance program.
  • Present progress, metrics, and risks to senior leadership.
  • Mentor governance analysts, stewards, and platform administrators.
  • Coordinate with system integrators and vendors as required.


Required Skills and Qualifications


Mandatory

  • 12–18+ years in data management, data governance, or analytics leadership.
  • Deep hands‑on experience implementing Collibra from scratch at enterprise scale.
  • Strong expertise in business glossary and metadata management, stewardship models, and workflow automation in Collibra.
  • Proven track record driving enterprise adoption of governance platforms.
  • Excellent stakeholder management and communication skills.


Preferred

  • Experience in real estate, property management, construction, facilities, or capital projects.
  • Familiarity with DAMA‑DMBOK, DCAM, or similar governance frameworks.
  • Exposure to data quality tools such as Soda, Great Expectations, or Informatica DQ.
  • Experience integrating Collibra with cloud data platforms.
  • Prior experience leading governance programs in large, federated organizations.
  • Collibra certification is a plus.


Behavioral and Leadership Attributes

  • Strategic thinker with strong execution capability.
  • Balances business pragmatism with governance rigor.
  • Influences without formal authority and drives change.
  • Excellent storytelling and change management skills.
  • Hands‑on leader who can configure Collibra and mentor teams.


Success Measures: First 12 Months

  • Collibra platform live with core real estate domains onboarded.
  • Business glossary adopted across key business units.
  • Formal data ownership established for critical datasets.
  • Measurable improvement in metadata completeness and data quality visibility.
  • Governance operating model embedded into daily business processes.
Data Analyst Manager
✦ New
Salary not disclosed
Hickory, NC 4 hours ago

Who We Are

At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.

What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.

Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?


Role Summary:

The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.


Responsibilities:

Data Architecture & Tooling

  • Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
  • Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
  • Ensure alignment between data infrastructure and evolving business needs across distribution operations
  • Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite

Data Governance & Quality

  • Define and enforce data standards, metric definitions, and naming conventions across all business domains
  • Establish data ownership, lineage documentation, and access governance policies
  • Implement monitoring and alerting for data quality issues across source systems and the warehouse
  • Build and maintain a data dictionary that serves as the single source of truth for the organization
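A data dictionary like the one described can start as a simple versioned mapping of metric names to definitions and owners, plus a coverage check flagging report fields that lack a governed definition. Metric names and owners below are hypothetical, purely to illustrate the shape.

```python
# Toy data dictionary: metric definitions with owners, plus a coverage check
# that flags report fields missing a governed definition. Names are hypothetical.

DATA_DICTIONARY = {
    "net_sales": {"definition": "Gross sales minus returns and discounts",
                  "owner": "Finance"},
    "units_shipped": {"definition": "Units leaving the warehouse per day",
                      "owner": "Operations"},
}

def undefined_fields(report_fields):
    """Return report fields with no entry in the data dictionary."""
    return sorted(f for f in report_fields if f not in DATA_DICTIONARY)

missing = undefined_fields(["net_sales", "units_shipped", "sell_through_rate"])
```

Running the coverage check against every published report keeps the dictionary honest as the single source of truth: any new field must get a definition and an owner before it ships.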

Orchestration of Analysts & Systems

  • Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
  • Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
  • Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
  • Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business


Qualifications:

Required

  • 3-5 years of experience in data analytics, business intelligence, or data engineering
  • 2+ years in a lead or management role overseeing analysts or data team members
  • Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
  • Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
  • Proven track record establishing data governance standards and documentation practices
  • Experience integrating and managing multiple data sources across SaaS and ERP platforms
  • Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders

Preferred

  • Experience in distribution, wholesale, or supply chain environments
  • Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
  • Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
  • Background working within IT-led or hybrid IT/Analytics teams


Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Life & Disability insurance
  • 401(K) with company match


Company Paid holidays and PTO:

  • Feetures offers 20 PTO days, available to all employees from day one of employment, no matter your role. After working at Feetures for 5 years, your PTO increases to 25 days. Days can be used for vacations, appointments, and sick days.
  • We offer 10 company paid holidays and 1 floating holiday per year.


Perks:

  • Parking provided (Charlotte office and onsite at Hickory office)
  • Employee Engagement team
  • Monthly stipend to pursue an active lifestyle


Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.

Not Specified