SQL Jobs with Query Examples in the USA

1,411 positions found — Page 10

Data Integration & AI Engineer
Salary not disclosed
Edison, NJ 2 days ago

About Wakefern

Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.


Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.


The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.


Essential Functions

  • Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
  • Implement and enforce data quality and governance standards to ensure data accuracy and consistency.
  • Provide input for project plans and timelines to align with business objectives.
  • Monitor project progress, identify risks, and implement mitigation strategies.
  • Work with cross-functional teams and ensure effective communication and collaboration.
  • Provide regular updates to the management team.
  • Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the tech structure.
  • Communicate and promote the code of ethics and business conduct.
  • Ensure completion of required company compliance training programs.
  • Be trained – either through formal education or through experience – in software/hardware technologies and development methodologies.
  • Stay current through personal development and professional and industry organizations.

Responsibilities

  • Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
  • Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
  • Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
  • Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
  • Ensure data solutions and data sources meet quality, security, and compliance standards.
  • Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
  • Provide technical training, documentation, and ongoing support to end users of data automation systems.
  • Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
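As a rough sketch of the pipeline responsibilities above (extract, validate, de-duplicate, load), the following toy example uses invented column names, an in-memory CSV source, and SQLite as a stand-in target; it is an illustration of the pattern, not Wakefern's actual stack:

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (in-memory here for illustration)
raw = io.StringIO("sku,price\nA1, 10.5 \nA2,\nA1, 10.5 \n")
rows = list(csv.DictReader(raw))

# Transform: trim whitespace, drop rows missing a price, de-duplicate
seen = set()
clean = []
for r in rows:
    price = r["price"].strip()
    if not price:
        continue  # enforce a simple completeness rule
    key = (r["sku"], price)
    if key in seen:
        continue  # drop exact duplicates
    seen.add(key)
    clean.append((r["sku"], float(price)))

# Load: write the validated rows into a target table
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE products (sku TEXT, price REAL)")
con.executemany("INSERT INTO products VALUES (?, ?)", clean)
loaded = con.execute("SELECT COUNT(*) FROM products").fetchone()[0]
```

In a production pipeline the validation and de-duplication rules would be driven by the governance standards the posting describes, and the load target would be a warehouse such as BigQuery rather than SQLite.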


Qualifications

  • A bachelor's degree or higher in computer science, information systems, or a related field.
  • Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
  • Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
  • Experience with GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Experience with workflow orchestration tools such as Cloud Composer or Airflow
  • Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
  • Experience developing and managing data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
  • Experience building and maintaining scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
  • Experience leveraging cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
  • Experience establishing and enforcing data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
  • Experience collaborating closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
  • Hands-on experience with IBM DataStage and Alteryx is a plus.
  • Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
  • Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
  • Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
  • Familiarity with data modeling tools.
  • Familiarity with DevOps practices for data (CI/CD pipelines)
  • Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
  • Strong knowledge and skills in data management, data quality, and data governance.
  • Strong communication, collaboration, and problem-solving skills.
  • Ability to work on multiple projects and prioritize tasks effectively.
  • Ability to work independently and in a team environment.
  • Ability to learn new technologies and tools quickly.
  • The ability to handle stressful situations.
  • Highly developed business acumen.
  • Strong critical thinking and decision-making skills.
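To make the indexing and query-optimization bullet concrete, here is a minimal sketch; the table and index names are invented, and SQLite stands in for whichever engine a given role uses:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 50, float(i)) for i in range(1000)])

# Without an index, filtering on customer_id scans the whole table
plan_before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall()

# Adding an index lets the planner seek directly to matching rows
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7").fetchall()
```

Reading the query plan before and after an index change is the same habit interviewers probe with "query optimization" questions: confirm the plan switched from a full scan to an index search before declaring the query fixed.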


Working Conditions & Physical Demands

This position requires in-person office presence at least 4x a week.


Compensation and Benefits

The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.

Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.


Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.

Not Specified
AI Innovation Architect
🏢 Indev
Salary not disclosed
Washington, DC 2 days ago

AI Innovation Architect


Location: Hybrid; Ashburn, VA; Springfield, VA; Washington, D.C.


Clearance: U.S. Citizen; Must have an active Top-Secret Clearance or DHS Public Trust Clearance.


InDev is seeking a senior strategic and technical AI Architect responsible for designing, building, and deploying artificial intelligence solutions that support mission outcomes across the homeland security market. In this role you will bridge advanced AI capabilities - including machine learning, natural language processing, and data engineering - with operational requirements, ensuring solutions are secure, scalable, and aligned with the homeland security mission.


YOUR FUTURE DUTIES AND RESPONSIBILITIES

  • Define overall system architecture, selecting and governing Artificial Intelligence / Machine Learning (AI/ML) and platform technologies, and ensuring solutions are scalable, secure, and production-ready
  • Lead end-to-end technical design, development, and implementation of an agentic AI system to orchestrate user queries across enterprise data sources
  • Partner closely with development, DevOps and data engineering teams to translate project requirements into an extensible AI architecture
  • Create and promote AI strategies that align with business objectives
  • Develop and coordinate POCs to test new technologies
  • Evaluate and select appropriate AI tools, frameworks, and platforms (e.g., AWS, Azure, Google) to drive innovation


QUALIFICATIONS

  • U.S. Citizen; Active Top-Secret Clearance or DHS Public Trust Clearance
  • 8+ years of experience delivering AI solutions across federal agencies
  • Bachelor’s degree in Computer Science, Engineering, or Data Science
  • Deep understanding of machine learning (ML), deep learning, Natural Language Processing (NLP), and neural networks
  • Experience with cloud platforms (AWS, Google Cloud, Azure) and container orchestration tools like Kubernetes and Docker
  • Ability to identify high-impact AI use cases and translate them into technical requirements
  • Experience designing, building, and deploying advanced AI systems including Generative AI, AI Agents, LLMs, Reinforcement Learning, and computer vision models
  • Ability to apply cloud and engineering expertise across AWS, GCP, Kubernetes, Docker, Terraform, Helm, Linux, and AI services, such as SageMaker, Vertex AI, Bedrock, or Gemini
  • Experience with Python, agent frameworks, data engineering, APIs/microservices, vector databases, SQL engines, distributed systems, cloud services, RAG
  • Experience developing and maintaining AI/ML roadmaps, performing Analysis of Alternatives, and making defensible technical tradeoff decisions
  • Experience leading multidisciplinary teams, including data scientists, engineers, and business stakeholders
  • Excellent written and oral communication skills
  • Ability to tailor and present information across multiple stakeholders


NICE TO HAVES

  • Experience integrating AI solutions with SaaS/PaaS platforms (e.g., ServiceNow, Salesforce, etc.)
  • Experience implementing virtual agents within SaaS/PaaS platforms (e.g., ServiceNow Virtual Agent, Salesforce Agentforce, etc.)
  • Experience with Google Gemini


ABOUT US

At InDev, we’re not just a company; we’re a trailblazing force transforming the way data shapes the future. As a dynamic player in the federal government sector, we’re on a mission to empower agencies with cutting-edge data solutions that drive innovation, efficiency, and progress. Our team thrives on collaboration, innovation, and embracing challenges head-on to create a meaningful impact on the world around us.


WHY INDEV

  • Innovative Environment: Join a team that thrives on creativity and innovation, where your ideas are not only heard but encouraged.
  • Meaningful Impact: Contribute to projects that directly impact federal agencies, driving positive change on a national scale.
  • Dynamic Collaboration: Work alongside diverse experts who are passionate about pushing boundaries and making a difference.
  • Agile Mindset: Embrace Agile methodologies that encourage flexibility, adaptability, and rapid growth.
  • Learning Culture: Enjoy ongoing learning opportunities and professional development to expand your skill set.
  • Cutting-edge Tech: Engage with the latest technologies and tools in the data integration landscape.


If you’re ready to embark on a journey of innovation, collaboration, and impact, InDev welcomes you to join our team. Let’s shape the future together.

Not Specified
Full Time NonProfit FileMaker Developer
Salary not disclosed
Chicago, IL 2 days ago

About the Company



The HistoryMakers, a 501(c)(3) and the nation’s largest African American video oral history archive, seeks to hire a Non Profit FileMaker Pro Developer to manage, maintain, and modernize our database systems built in Claris FileMaker. Our organization currently operates six interconnected FileMaker databases that support core business operations. The selected candidate will be responsible for cleaning up legacy structures, improving performance, ensuring data integrity, and upgrading the system to the latest FileMaker version. This is a hands-on technical role focused on database architecture, system optimization, and long-term maintenance. The ideal candidate will be comfortable working with complex relational systems and improving existing database designs.



About the Role



The selected candidate will be responsible for cleaning up legacy structures, improving performance, ensuring data integrity, and upgrading the system to the latest FileMaker version.



Responsibilities



  • Database Architecture & Maintenance
  • Analyze and document the structure of six interconnected FileMaker databases
  • Review and improve relational schema and relationship graphs
  • Identify and remove unused tables, fields, scripts, and layouts
  • Ensure proper indexing and relational integrity
  • Maintain and optimize the overall database architecture


  • Data Integrity & Cleanup
  • Identify duplicate or inconsistent records and implement cleanup procedures
  • Standardize data formats across databases
  • Implement validation rules and controlled data entry where needed
  • Develop procedures to ensure long-term data integrity
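FileMaker has its own scripting tools, but the duplicate-record cleanup described above can be illustrated with generic SQL; the table and columns here are hypothetical, with SQLite used for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE interviews (id INTEGER PRIMARY KEY, subject TEXT, tape_no INTEGER)")
con.executemany("INSERT INTO interviews (subject, tape_no) VALUES (?, ?)",
                [("Smith", 1), ("Smith", 1), ("Jones", 2), ("Smith", 1)])

# Find duplicate (subject, tape_no) combinations
dupes = con.execute("""
    SELECT subject, tape_no, COUNT(*) AS n
    FROM interviews
    GROUP BY subject, tape_no
    HAVING COUNT(*) > 1
""").fetchall()

# Keep the lowest id in each group, delete the rest
con.execute("""
    DELETE FROM interviews
    WHERE id NOT IN (SELECT MIN(id) FROM interviews GROUP BY subject, tape_no)
""")
remaining = con.execute("SELECT COUNT(*) FROM interviews").fetchone()[0]
```

The same two-step pattern (report duplicates first, then delete while keeping one canonical record) is a safe default for legacy cleanup, since the report can be reviewed before any destructive step runs.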


  • System Modernization & Upgrades
  • Upgrade databases to the latest version of Claris FileMaker Pro
  • Ensure compatibility with Claris FileMaker Server
  • Update scripts, layouts, and features that rely on deprecated functionality
  • Perform system testing to ensure stability during and after upgrades


  • Development & Automation
  • Design and maintain FileMaker scripts and custom functions
  • Develop layouts and user interfaces that improve usability and workflow
  • Automate repetitive tasks and reporting processes
  • Improve performance of existing scripts and database queries


Documentation

  • Document database structures, relationships, and workflows
  • Maintain technical documentation for scripts and system changes
  • Provide internal documentation to support future maintenance and training

Qualifications


  • Strong experience developing solutions in Claris FileMaker
  • Solid understanding of relational database design principles
  • Experience with FileMaker scripting, calculations, and relationship graphs
  • Experience troubleshooting and optimizing FileMaker performance
  • Ability to work with large datasets and complex legacy systems
  • Strong problem-solving and analytical skills


Required Skills


  • Experience managing systems using Claris FileMaker Server
  • Knowledge of SQL and external database integrations
  • Experience using ODBC or API integrations
  • Experience performing FileMaker version upgrades and system migrations


Pay range and compensation package


Salary is commensurate with experience and qualifications. The HistoryMakers also offers a competitive benefits package that includes 403(b), PTO, health, vision and dental insurance, tuition reimbursement and school loan repayment assistance.



The HistoryMakers is the digital repository for the Black experience: providing much needed content, role models, success pathways and frameworks for a 21st century citizenry that has become increasingly less tolerant, divisive and economically and educationally disparate. Please send resumes to: This position must work ON SITE at The HistoryMakers offices in Chicago's South Loop. This position is neither remote nor hybrid.

permanent
Senior Business Analyst
✦ New
Salary not disclosed
Dallas, TX 1 day ago

Job Title: Senior Business Analyst

Location: Dallas, TX


Essential Duties & Responsibilities:

· Act as a liaison between technical teams and business users, managing communications between business stakeholders and technical teams to enhance understanding of desired scope and requirements, and assisting project delivery teams in implementing accurate solutions.

· Gather, analyze, and document business requirements related to the investment lifecycle process, portfolio management systems, and alternative investments.

· Drive proactive & transparent communications with key stakeholders related to project scope and related assumptions to properly set and manage expectations.

· Excellent writing skills, with the ability to create clear requirements, specifications, and documentation.

· Work closely with technical developers and QA teams to ensure business requirements are met.

· Ability to perform data research to identify data quality issues.

· Strong time management & prioritization skills.

· Able to meet deliverables with high level of accuracy.

Contacts:

This position has frequent contact with all levels of employees and management. In addition, this role may interact with outside business partners, vendors, consultants, and other office visitors.

Education, Skills & Experience:

· Bachelor's Degree (BA/BS) or equivalent from four-year college or university.

· 8+ years’ experience as a business analyst, with a strong focus on Asset Management, Private Equity and Loan operations.

· Solid understanding of investment products, fund structures, and loan lifecycle events.

· Experience with portfolio and loan management systems.

· Proficiency in creating BRDs, FRDs, user stories, process flows, and data mapping documents.

· Strong analytical skills, attention to detail and problem-solving abilities.

· Excellent communication and stakeholder management skills.

· Familiarity with key database concepts including star schema, dimensions, facts, and master data management (MDM).

Must-Have Qualifications:

· Strong understanding of investment life cycle process, Portfolio management systems.

· Extensive experience and subject-matter expertise in asset management, private equity and loan servicing process.

· Proficiency with SQL queries.
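The star-schema concepts the posting lists (facts, dimensions) can be sketched as follows; the table names and data are hypothetical, with SQLite standing in for a warehouse:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes, one row per security (invented schema)
con.execute("""CREATE TABLE dim_security (
    security_id INTEGER PRIMARY KEY, ticker TEXT, asset_class TEXT)""")
# Fact table: measurable events, each keyed to a dimension row
con.execute("""CREATE TABLE fact_trade (
    trade_id INTEGER PRIMARY KEY, security_id INTEGER, amount REAL)""")

con.executemany("INSERT INTO dim_security VALUES (?, ?, ?)",
                [(1, "ABC", "Private Equity"), (2, "XYZ", "Loans")])
con.executemany("INSERT INTO fact_trade VALUES (?, ?, ?)",
                [(10, 1, 500.0), (11, 1, 250.0), (12, 2, 100.0)])

# Typical star-schema query: aggregate facts, grouped by a dimension attribute
totals = dict(con.execute("""
    SELECT d.asset_class, SUM(f.amount)
    FROM fact_trade f JOIN dim_security d ON f.security_id = d.security_id
    GROUP BY d.asset_class
""").fetchall())
```

The fact table stays narrow and numeric while descriptive attributes live in the dimension, which is what makes reporting queries like this one simple joins rather than repeated denormalized columns.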
Not Specified
Product Data Analyst (JOB ID 1989)
✦ New
Salary not disclosed
Atlanta, GA 1 day ago

*U.S. citizenship and residency in the Atlanta area are required*


OneSparQ is looking for a Product Data Analyst to contribute to a growing wholesale distributor in Atlanta, GA.


Required Skills:

  • Bachelor's degree in Information Systems, Business Administration, Supply Chain, or a related field
  • 3–5 years of experience working with product data, ideally within wholesale distribution, manufacturing, or consumer products environments
  • Working knowledge and experience with enterprise-grade PIM Platforms (EnterWorks Preferred)
  • Data governance, GS1 standards knowledge, and data analytics
  • Proficiency in Microsoft Excel
  • Working knowledge of SQL, data querying, and database extraction techniques


Additional Skills: (not required)

  • ERP system knowledge (SAP, Oracle, Infor)
  • Power BI Experience


Responsibilities:

  • Manage and maintain product data within the Product Information Management (PIM) system, including product records, attributes, categories, and digital assets
  • Oversee the full lifecycle of product data, from product onboarding and enrichment to discontinuation
  • Collaborate with IT, Merchandising, Marketing, and Supply Chain teams to ensure product data is accurate, complete, and launch-ready
  • Support PIM system workflows, integrations, and testing to ensure smooth data management across internal systems
  • Maintain product data standards and conduct regular audits to ensure consistency, accuracy, and compliance
  • Create and manage product content including titles, descriptions, specifications, and marketing details
  • Coordinate the publication and distribution of product information to internal systems, eCommerce platforms, and external marketplaces
  • Build and maintain reporting dashboards to track data quality, completeness, and vendor compliance
  • Work with vendors and internal teams to resolve data issues and ensure proper product data submissions
  • Provide training and support to internal teams on PIM tools, processes, and data standards
Not Specified
Datacenter Technician
✦ New
🏢 Akkodis
Salary not disclosed
Santa Clara, CA 1 day ago

Akkodis is seeking an Engineering Technician for a contract role with a client in Santa Clara, CA (onsite). We’re ideally looking for an applicant with 4+ years of equivalent experience in a Lab or Datacenter environment, Visio and CAD experience for Lab R&D projects and Rack Management, and experience handling PDUs and Power in Labs.


Pay Range: $63-$65/hour; The rate may be negotiable based on experience, education, geographic location, and other factors.


Summary

  • We're looking for a motivated Engineering Technician (Contract) to support our Colossus quality assurance labs. In this role, you will face the challenge of providing a test-bed for our developers to test software on various hardware before release.
  • Additionally, you will collaborate with Infrastructure Engineers, installing and maintaining Windows/Linux platforms and using creativity to find solutions.
  • We expect things to break in this lab, as the software is mostly low-level device drivers, and bugs in them do break boards and GPUs.
  • We seek to catch problems early in our labs rather than in user devices.
  • Our labs run more than 100,000 tests per day and are part of a DevOps pipeline that needs constant supervision, tracking, monitoring, and break-fix.


What you'll be doing:

  • Handling Labs and Datacenters using DCIM Tools, spreadsheets and task tracking tools.
  • Your responsibilities will also include defining standards in labs to keep them safe, clean and organized.
  • Deploy test boards that run automated tests from software developers, and triage and root-cause board issues that stem from test setup problems rather than hardware or software defects.
  • Remove and redeploy boards that need software and/or hardware upgrades from board engineers on a regular cadence.
  • Work closely and pro-actively with other engineering teams such as system architects, chip and board designers, software/firmware engineers, HW/SW QA teams and Applications engineering teams to drive design, development, debug and release of next generations products.
  • Take active part in procurement decisions for Lab by choosing from various options available, getting test copies and doing proof of concepts and then providing recommendations.
  • Collect data for critical metrics for the lab and track progress.


What we need to see:

  • Associate's or Bachelor's degree in a tech-related major, or 4+ years of equivalent experience in a Lab or Datacenter environment.
  • Ability to perform well at work without requiring constant manager supervision.
  • Ability to deploy and cable servers and test equipment.
  • Basic user-level understanding of Unix/Windows, and networking with enterprise switches and routers.
  • Skills to work with teammates of various abilities and experiences.
  • Ability to identify tasks where you need help from sys-admins, communicate those needs, and coordinate with them to integrate solutions.
  • Perseverance to debug hard problems and out-of-the-box thinking to solve them.
  • To be successful in this position, you should have a love of working with close-knit, multi-disciplinary teams, and enjoy hands-on work with state of the art platforms.


Ways to stand out from the crowd:

  • Visio and CAD experience for Lab R&D projects and Rack Management.
  • Lab/Datacenter Procurement Experience.
  • Experience with handling PDUs and Power in Labs.
  • System administrator level experience on Unix/windows and knowledge of scripting to automate workflows (bash/python).
  • Basic knowledge of Git/Perforce to check-out, edit and check-in scripts.
  • Ability to write SQL queries to get data from MySQL DBs.
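The last bullet, pulling metrics with SQL, might look like the sketch below; SQLite stands in for MySQL here, and the table and columns are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE test_runs (board TEXT, run_date TEXT, passed INTEGER)")
con.executemany("INSERT INTO test_runs VALUES (?, ?, ?)", [
    ("gpu-a", "2024-05-01", 1),
    ("gpu-a", "2024-05-01", 0),
    ("gpu-b", "2024-05-01", 1),
    ("gpu-b", "2024-05-02", 1),
])

# Per-day run count and pass rate: the kind of critical-metric query the role describes
pass_rates = con.execute("""
    SELECT run_date, COUNT(*) AS runs, AVG(passed) AS pass_rate
    FROM test_runs
    GROUP BY run_date
    ORDER BY run_date
""").fetchall()
```

Against a real MySQL lab database the same query would run through a driver such as `mysql.connector` or SQLAlchemy, but the GROUP BY/AVG aggregation is identical.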


If you feel this is not something that you are currently interested in, but know of someone who might be, please share the details with them or let me know their details so I can reach out to them!

Equal Opportunity Employer/Veterans/Disabled

Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, an EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client.

To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:

· The California Fair Chance Act

· Los Angeles City Fair Chance Ordinance

· Los Angeles County Fair Chance Ordinance for Employers

· San Francisco Fair Chance Ordinance

Not Specified
Head of Business Operations
✦ New
Salary not disclosed
Reno, NV 1 day ago

Head of Business Operations


Brief Summary

The Head of Business Operations owns the configuration, integrity, and scalability of the company's business operations systems, serving as the bridge between business strategy and technical execution, reporting directly to the CEO/Co-Founder. This role is responsible for translating institutional knowledge into scalable business processes, ensuring data integrity, and enabling the transition from ad-hoc decision making to data-driven workflows. This is a senior management role with individual-contributor responsibilities, broad cross-functional authority, and high executive visibility.

The Head of Business Operations will take a lead role in defining the data architecture, implementing process guardrails, and analyzing operational data to drive strategy. This person acts as the cross-functional orchestrator of the business operations system, collaborating with Sales, Production, and Leadership to extract & refine business logic and codify it into streamlined processes. Success in this role requires a strong backbone to enforce higher standards, and an analytical and systems-thinking mindset to visualize downstream effects.


What Success Looks Like

● All core workflows are analyzable, have entrance/exit criteria, and are governed by continuously improving SOPs

● Leadership can answer key operational questions without ad-hoc data pulls

● Administrative overhead for sales and production staff is measurably reduced through intuitive, user-centric workflow design and automation.

● Data integrity is proactively enforced through automated validation gates, ensuring all transactions reaching Production meet technical completeness standards

● Schema changes follow a formal change process without disruptive production breakage

● Cross-team handoffs show measurable reductions in rework or delays

● Operational reporting has shifted from reactive status checks to predictive insights, providing automated triggers for churn risks and production bottlenecks


Duties & Responsibilities

Requirements Engineering (Internal Product Owner)

● Conduct structured interviews with stakeholders (Sales, Production) to extract complex business logic, transforming qualitative requirements into workflow pipelines, binary system gates, and automation triggers.

● Treat internal tools as a "Product" and internal staff as "Users," conducting user research to ensure workflows are intuitive and reduce friction.

● Act as the liaison between business stakeholders and technical teams to ensure alignment.

● Define, mandate, and manage the company's "Data Dictionary" and Standard Operating Procedures (SOPs), ensuring a unified language and common framework is adopted across all functional teams.


System Ownership & Platform Governance

● Own the configuration and architecture of the company’s operating platform (currently ), defining object relationships and preventing schema drift.

● Translate strategic business objectives into system logic, automation rules, and workflows to create a scalable operating platform that generates measurable, actionable data.

● Define and enforce strict "Entrance and Exit Criteria" for all business process stages to prevent data errors (the enforcement aspect).

● Manage the change control process for system updates to prevent disruption to active workflows.

Business Intelligence

● Responsible for building decision-grade operational reporting and analysis (but not exploratory data science/research or data engineering).

● Query and analyze cross-functional data to drive strategic business decisions, identify performance gaps, and uncover opportunities for revenue optimization and growth (e.g., ROAS, marketing attribution, churn risks, customer LTV).

● Own and facilitate the weekly business review, working with management and leads to refine reporting and insights across the organization.

● Design and maintain management reporting dashboards to track key performance indicators and operational health.


Decision Authority

This role has final decision authority over the following areas:

● Operating system structure and data definitions

● Workflow stage definitions and gating logic

● Approval or rejection of system changes that affect data integrity


Desired Qualifications & Traits

● Systems Thinker: Possesses strong systems thinking capabilities, naturally visualizing the downstream effects of upstream changes (e.g., how a change in the Sales form affects the Production floor). They prioritize long-term scalability over short-term "hacks."

● Pragmatic Architect: Maintains a pragmatic approach to architecture, balancing "perfection with business utility." They know when to implement a rigid constraint and when to allow manual flexibility, always focused on delivering high-utility features.

● Operational Excellence Steward: Demonstrates operational discipline and the ability to define, promote, and enforce process compliance among diverse teams. They value consistency and predictability and are willing to say "No" when requests threaten system integrity and guide the team to the right trade-off.

● Analytical & Problem-Solving Mindset: Possesses an investigative nature, focusing on finding root causes and proactively hunting for "process leaks" and undefined variables. They validate assumptions with data rather than anecdotes.

● Coach & Change Leader: Possesses high emotional intelligence and the teaching ability to re-program legacy habits. They can explain why a new system is better to resistant teams and guide them through the transition with patience and clarity.

● Ambiguity Simplifier: Has the ability to simplify ambiguity, taking chaotic business inputs and structuring the information into linear, standardized processes.

● Translator & Data-Centric Communicator: Has strong communication skills to fluently bridge the gap, explaining technical constraints to non-technical stakeholders in plain English.

● Detail-Oriented: Is highly detail-oriented, obsessed with consistent naming conventions and data definitions. They notice misalignment in data definitions immediately, ensuring organizational clarity and data integrity.


Experience & Educational Requirements & Preferences

Experience & Educational Background

● 7+ years of experience in Business Operations, Systems Administration, or Data Analysis.

● Bachelor’s degree in Business, Information Systems, or related field required; Master's degree preferred.

● People Management and Team Building


Platform Expertise & Architecture

● Low-Code/No-Code Mastery: Advanced proficiency with Low-Code/No-Code platforms (e.g., Airtable, Salesforce) is required, including the management of complex automation rules, dependencies, and integration webhooks.

● Business Object Modeling / Relational Database Design: Proven experience designing relational database schemas (One-to-Many, Many-to-Many), specifically including the ability to translate flat spreadsheets into relational objects (e.g., separating "Orders" from "Line Items").

● API & Integration Knowledge: Ability to read API documentation to understand system capabilities/limitations.

● Lightweight Scripting & Automation (Preferred): Proficiency with basic data-related scripting (Python, SQL) or advanced spreadsheet macros (VBA) to independently manipulate datasets or prototype logic is a strong plus.
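The two skills above come together in a small exercise: splitting a flat "orders" spreadsheet into relational objects. A minimal sketch using Python's built-in sqlite3 module, with a hypothetical schema (the table and column names are illustrative, not from the posting):

```python
import sqlite3

# In-memory database; "orders" / "line_items" are an assumed example schema.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# One-to-many: one order has many line items.
conn.execute("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE line_items (
        item_id  INTEGER PRIMARY KEY,
        order_id INTEGER NOT NULL REFERENCES orders(order_id),
        sku      TEXT NOT NULL,
        qty      INTEGER NOT NULL
    )""")

# A flat spreadsheet row like (order, customer, sku, qty) splits into two inserts;
# INSERT OR IGNORE deduplicates the repeated order header.
flat_rows = [
    (1, "Acme", "WIDGET-A", 3),
    (1, "Acme", "WIDGET-B", 1),
    (2, "Globex", "WIDGET-A", 5),
]
for order_id, customer, sku, qty in flat_rows:
    conn.execute("INSERT OR IGNORE INTO orders VALUES (?, ?)", (order_id, customer))
    conn.execute("INSERT INTO line_items (order_id, sku, qty) VALUES (?, ?, ?)",
                 (order_id, sku, qty))

# Join the objects back together to reproduce the original flat view.
rows = conn.execute("""
    SELECT o.customer, li.sku, li.qty
    FROM orders o JOIN line_items li ON li.order_id = o.order_id
    ORDER BY o.order_id, li.item_id
""").fetchall()
print(rows)
```

The point of the split is that customer data lives in exactly one row, so a correction never has to be repeated across every line item.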


Process, Intelligence, & Change Management

● Business Process Modeling (BPM): Experience with Business Process Modeling (BPM), including creating detailed swimlane diagrams to visualize hand-offs and defining strict "Entrance and Exit Criteria" for process stages.

● Business Intelligence (BI) & Reporting: Proficiency in designing Business Intelligence (BI) dashboards and reports, with an understanding of how to structure data for customer segmentation and cohort analysis.

● Change Management & Training: Experience managing change, designing rollout plans, and creating training materials and SOPs for users in a fast-paced environment.

Not Specified
Software Development Engineer in Test (SDET)
✦ New
Salary not disclosed
Okemos, MI 6 hours ago

Job Summary:

We are seeking a forward-thinking SDET to help modernize and lead our test automation strategy. This role will focus on building and maintaining scalable, maintainable, and integrated test automation frameworks across UI and API layers using modern tools like Playwright and TypeScript, while also contributing to CI/CD testing integration. The SDET will also play a key role in supporting manual testing efforts within Agile feature teams—guiding test case design, exploratory testing, and quality validation for areas not yet automated. This position is critical to enabling feature teams to take ownership of both automated and manual testing, ensuring faster, higher-quality releases.

Primary Responsibilities:

· Partner with Agile feature teams to understand user stories, define acceptance criteria, and promote a test-first mindset through collaboration in design and refinement sessions.

· Build, maintain, and evolve test automation frameworks using Playwright (preferred), TypeScript, or other enterprise-approved tools to support API and UI testing.

· Drive the transition from legacy frameworks (e.g., Selenium + Java, Postman/Newman) to unified automation aligned with our CI/CD strategy.

· Collaborate with developers and QA engineers to ensure test cases are executed in CI pipelines and provide fast, actionable feedback.

· Support feature teams with manual testing efforts when needed, including test case design, exploratory testing, and validation of complex workflows that are not yet automated.

· Develop and maintain automated API tests (REST/SOAP) and end-to-end tests that validate functional and non-functional requirements.

· Participate in code reviews and contribute to test architecture decisions to ensure reliability, reusability, and scalability of test assets.

· Write and maintain SQL queries to validate data integrity and support test data creation strategies.

· Serve as a quality engineering champion: guide automation strategy, mentor peers, and contribute to continuous improvement of QA practices.

· Ensure traceability between requirements, tests, and defects, and support compliance with enterprise policies (e.g., HIPAA, security, audit readiness).
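The SQL-validation responsibility above typically means checking referential integrity between related tables. A minimal sketch with Python's sqlite3 and an invented two-table schema (the table names and data are hypothetical):

```python
import sqlite3

# Hypothetical schema for illustration; real table names come from the app under test.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE members (member_id INTEGER PRIMARY KEY, plan TEXT);
    CREATE TABLE claims  (claim_id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL);
    INSERT INTO members VALUES (1, 'gold'), (2, 'silver');
    INSERT INTO claims  VALUES (10, 1, 120.0), (11, 2, 80.0), (12, 99, 50.0);
""")

# Integrity check: every claim should reference an existing member.
# A LEFT JOIN with an IS NULL filter surfaces orphaned rows.
orphans = conn.execute("""
    SELECT c.claim_id FROM claims c
    LEFT JOIN members m ON m.member_id = c.member_id
    WHERE m.member_id IS NULL
""").fetchall()

print(orphans)   # claim 12 points at a member that does not exist
```

In a test suite, a query like this runs after a workflow under test and asserts that the result set is empty.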

Preferred Experience:

· Hands-on experience with Playwright (TypeScript preferred) or similar frameworks like Cypress.

· Migration experience from Selenium + Java or Postman to modern frameworks.

· Familiarity with test reporting, dashboarding, and quality metrics in a DevOps environment.

· Experience with test case management tools (e.g., qTest, QMetry, TestRail) and requirements traceability.

Not Specified
AI Engineer
✦ New
Salary not disclosed
Greenwich, CT 6 hours ago

We are looking for a highly motivated AI Engineer to join our IT team. This role is ideal for someone passionate about building real-world AI solutions and eager to work across the full AI technology stack—from model integration and retrieval pipelines to agentic AI workflows, multi-agent orchestration, and application-level features used by business teams. You will also contribute to data engineering efforts that feed AI capabilities, working alongside a modern analytics platform built on Microsoft Fabric.


As an AI Engineer, you will help design, develop, and deploy AI capabilities. You will contribute to production-grade AI features in areas such as Open-to-Buy planning, Sales Forecasting, Intelligent Order Management Systems (OMS), Product Copy Generation, and Image Generation.

This is a unique opportunity to work on meaningful, high-impact AI initiatives while implementing modern AI infrastructure, LLMOps practices, and scalable system design.


This role will work from our Greenwich, CT office and report to the Senior Director of System Integration & Operation on our current hybrid schedule, 3 days in office and 2 days remote.


Key Responsibilities:


AI Application Development

Build and maintain AI-powered features including:

  • Open-to-Buy optimization and inventory planning models
  • Sales forecasting and demand prediction solutions
  • Intelligent OMS features for routing, allocation, and automation
  • Marketing AI tools such as product copy generation and AI-assisted image generation

Integrate custom and foundation LLMs into internal applications using API and SDK interfaces, leveraging structured outputs, function/tool calling, and prompt caching to optimize reliability and cost.


RAG, GraphRAG, + Vector DB Engineering

  • Develop retrieval pipelines using vector embeddings and similarity search (Azure AI Search, FAISS, Pinecone, or equivalent).
  • Implement chunking, embedding, indexing, query routing, and relevance-tuning strategies, including advanced reranking and hybrid search techniques.
  • Maintain a high-quality knowledge base to support AI features via Retrieval-Augmented Generation.
  • Explore and implement GraphRAG patterns to improve knowledge retrieval over structured enterprise data and entity relationships.
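The chunk-embed-search loop described above can be sketched end to end in plain Python. This is a toy: the fixed-size chunker's parameters are assumed, and the bag-of-characters "embedding" merely stands in for a real embedding model such as those behind Azure AI Search or FAISS:

```python
import math

def chunk(text, size=40, overlap=10):
    """Fixed-size character chunking with overlap (assumed parameters)."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def embed(text):
    """Toy bag-of-characters 'embedding' standing in for a real model."""
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord('a')] += 1.0
    return vec

def cosine(a, b):
    """Cosine similarity, the usual metric for vector similarity search."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

docs = ["open to buy planning for retail inventory",
        "sales forecasting with seasonal demand",
        "order routing and warehouse allocation"]

# Index maps each chunk's vector back to its source document.
index = [(d, embed(c)) for d in docs for c in chunk(d)]

query = embed("inventory planning open to buy")
best = max(index, key=lambda pair: cosine(query, pair[1]))[0]
print(best)
```

A production pipeline swaps the toy embedding for a model endpoint and the linear scan for an approximate-nearest-neighbor index, but the retrieve-by-similarity structure is the same.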


AI Agents & Orchestration

  • Design and build AI agents capable of planning, tool use, and multi-step reasoning using frameworks such as LangGraph, PydanticAI, CrewAI, or Google ADK.
  • Implement Model Context Protocol (MCP) and Agent-to-Agent (A2A) protocol integrations to connect AI agents with internal tools, APIs, data systems, and other agents in a standardized, interoperable way.
  • Build guardrails, evaluation frameworks, and human-in-the-loop checkpoints to ensure reliable and safe agent behavior in production.


AI Infrastructure & System Architecture

  • Maintain the private-cloud LLM instance landscape, ensuring secure and efficient usage.
  • Assist in deploying scalable inference pipelines, batching, and caching layers.
  • Collaborate with DevOps and Data Engineering on CI/CD, model deployment workflows, monitoring, and integration with the Microsoft Fabric data platform (including Fabric MCP for agent-to-data connectivity).


Data Engineering, Pipelines & Model Training

  • Clean, transform, and prepare datasets for ML/AI pipelines; contribute to data engineering workflows including ELT pipeline design, medallion architecture patterns, and data transformation within the Lakehouse layer.
  • Train, validate, and fine-tune models where appropriate (LLMs, forecasting models, classification models, etc.); familiar with parameter-efficient techniques such as LoRA and QLoRA.
  • Evaluate model performance and optimize latency, accuracy, and cost using LLM evaluation and observability frameworks (e.g., RAGAS, LangSmith, Langfuse, Helicone, or custom evals); manage prompt versioning and regression testing.


Required Qualifications:

  • Bachelor’s degree in Computer Science, Data Science, AI/ML, Engineering, or related field.
  • Strong foundations in Python, data structures, and machine learning concepts.
  • Comfortable working with LLM APIs, embeddings, vector databases, and RAG patterns; exposure to agentic patterns, tool use, and GraphRAG concepts is a strong plus.
  • Familiarity with cloud environments (Azure preferred; AWS or GCP also acceptable).
  • Understanding of systems diagrams, architecture patterns, and AI infrastructure components.
  • Exposure to SQL/NoSQL databases.
  • Exposure to data engineering concepts such as ELT/ETL pipelines, data transformation, and data modeling.
  • Awareness of responsible AI principles including bias detection, fairness, and model interpretability.
  • Awareness of AI agent frameworks and orchestration concepts (e.g., LangGraph, PydanticAI, Semantic Kernel, CrewAI, or Google ADK).
  • Familiarity with prompt engineering best practices including chain-of-thought, few-shot prompting, and structured output design.
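The last qualification, few-shot prompting with structured output design, can be illustrated with a short sketch. The model call itself is stubbed out, since provider APIs vary; the example data and the `build_prompt`/`parse_structured` helpers are hypothetical:

```python
import json

# Few-shot examples: each one shows the exact JSON shape we expect back.
EXAMPLES = [
    {"review": "Runs small, but the leather is gorgeous.", "sentiment": "mixed"},
    {"review": "Fell apart after two wears.", "sentiment": "negative"},
]

def build_prompt(review: str) -> str:
    """Assemble a few-shot prompt that demonstrates the output format."""
    shots = "\n".join(
        f"Review: {ex['review']}\nAnswer: {json.dumps({'sentiment': ex['sentiment']})}"
        for ex in EXAMPLES
    )
    return f"Classify the sentiment as JSON.\n{shots}\nReview: {review}\nAnswer:"

def parse_structured(raw: str) -> str:
    """Validate the model's structured output instead of trusting free text."""
    data = json.loads(raw)
    if data.get("sentiment") not in {"positive", "negative", "mixed"}:
        raise ValueError(f"unexpected sentiment: {data!r}")
    return data["sentiment"]

prompt = build_prompt("Comfortable and true to size.")
fake_model_reply = '{"sentiment": "positive"}'   # stand-in for a real LLM response
print(parse_structured(fake_model_reply))
```

Validating the parsed JSON against an allowed set is what makes structured output safe to wire into downstream application logic.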


Preferred Qualifications:

  • Familiarity with Microsoft Fabric (OneLake, Lakehouse, Spark notebooks, semantic models) and Power BI; experience with Fabric MCP integrations is a strong differentiator.
  • Experience implementing MCP (Model Context Protocol) servers or A2A (Agent-to-Agent) protocol endpoints, or integrating AI agents with external tools and APIs.
  • Exposure to multimodal AI capabilities (vision-language models) for applications such as product image analysis or document understanding.
  • Experience building small AI apps, demos, or tools—portfolio/GitHub encouraged.


What You'll Gain:

  • Hands-on impact in designing enterprise AI capabilities from the ground up.
  • Opportunities to work with cutting-edge LLM technologies in a private, secure environment, alongside a modern Microsoft Fabric data platform.
  • A chance to shape AI products used across supply chain, marketing, and e-commerce.


Company Overview:

Established in 2005, Marc Fisher Footwear company is a leading full-service, product-driven fashion footwear company with knowledge and expertise in design, sales, sourcing, distribution and marketing – all with dedicated and strategic direction for each brand within the portfolio, which includes GUESS, G by Guess, Nine West, Tommy Hilfiger, Earth, Calvin Klein, Kenneth Cole Men's, Hunter Boots, Rockport, Bandolino, indigo rd., Unisa, and Easy Spirit along with the namesake brands – Marc Fisher and Marc Fisher LTD.


Our diverse portfolio of globally recognized brands – available domestically and internationally via wholesale and retail channels – consistently meets the widest range of consumers’ fashion footwear needs, from classic to contemporary, sport to dress, men’s to women’s. Headquartered in Greenwich, Connecticut, with showrooms in New York City, Marc Fisher Footwear is sold worldwide through department stores, specialty stores and e-commerce channels.


Marc Fisher Footwear is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, sexual orientation, age, status as a protected veteran, among other things, or status as a qualified individual with a disability. EEO Employer/Vet/Disabled.

Not Specified
Technical Program Manager
✦ New
Salary not disclosed
San Diego, CA 6 hours ago

Technical Program Manager

6 Month Contract role

PST- Remote- Tier 2/3 Locations

Work hours- Pacific

Pay rate: up to $50/hr on W2



Role Overview: You’ll work on and lead sophisticated cross-organizational programs working with partners to build roadmaps, plan requirements, manage timelines, identify risks, and communicate clearly with cross-functional partners across the company.


Must have skills:

• Working in Agile and scrum methodologies.

• Working at the intersection of engineering and product launch web applications/products.

• Agile and planning tools: Jira (must have). Nice-to-haves include JQL, Airtable, SQL, Tableau, Jellyfish, and other analytics tools.

Note: This worker needs to have skillsets/experience in a software environment. If their skillset is primarily in a hardware product environment, then they are not the candidate we are looking for

Interview process:

• 30 min filter interview with Hiring manager

• 45-60 min panel interview with 2 team members to cover TPM skillset



We are looking for a diligent, self-organized, and motivated individual with the ability to work independently through everyday tasks and challenges. The candidate must have knowledge of product lifecycle management and experience in agile execution. In addition, this role requires enough technical knowledge to influence and lead technical programs.

The successful candidate must be at ease working in a cross-functional, globally distributed team with a high emphasis on successful and timely delivery. The team is committed to diversity and inclusion; we love connecting people from different backgrounds, perspectives, and geographies!


Responsibilities:

· Define program objectives with key business partners, identify key products to be delivered, develop project specifications, and agree on a project plan baseline, including scope, key activities, deliverables, resource requirements, dependencies, timing, and constraints.

· Partner with engineering and product leaderships to drive consistency in delivering quality products through agile processes at scale

· Set up roadmaps, project plans and schedules.

· Manage relationships amongst key partners by building confidence and trust with clear professional communications on all management levels and assured expectation management.

· Drive program execution; track delivery; expect, monitor and control change, own scope management and risk management; proactively seek and resolve blockers through effective collaboration.

· Conduct progress reviews to assess project outcomes, build confidence that projects will deliver to time, budget and agreed standards. Provide timely, consistent and accurate reporting of the status of initiatives to stakeholders. Develop and deliver necessary presentations including supporting documentation to all levels throughout the organization.

Basic Qualifications:

· BA/BS degree required (technical degree preferred).

· At least 6 years of experience with program management (technical preferred).

· Strong process orientation as well as business acumen and communication skills.


Essential Skills Required:

· Experience moving technical or engineering programs and products from inception to delivery

· Good stakeholder management skills across all levels of hierarchy

· Good knowledge in simplifying/automating ways of working

· Proficient in analytical and problem-solving skills.

· Experience with collaboration, planning and project management tools (e.g. Airtable, JIRA, Confluence, dashboarding with queries and rich filters).

· Deep knowledge of product lifecycle management.

· Extreme attention to detail and precision in producing quality output.

· Proficient use of Google Office Suite (Docs, Sheets, and Slides)

Bonus Skills:

· Background in digital marketing domain or technology

· Scrum, Agile or Lean certification(s)

Not Specified