JSON API Include Example Jobs Hiring Now - Jobs in USA

18,417 positions found — Page 5

Sr AI Application Developer
Salary not disclosed
Milwaukee, WI 3 days ago

At Rite-Hite, your work makes an impact. As the global leader in loading dock and door equipment, we design and deliver solutions that keep our customers safe, secure, and productive. Here, you'll find innovation, stability, and the chance to grow your career as part of a team that's always looking ahead.

ESSENTIAL DUTIES AND RESPONSIBILITIES

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily.

  • Design and build AI-powered applications using Large Language Models (LLMs) for enterprise use cases.
  • Develop Retrieval-Augmented Generation (RAG) solutions using structured and unstructured enterprise data such as documents, manuals, tickets, ERP data, and knowledge bases.
  • Build and orchestrate AI agents that can reason, plan, and interact with tools, APIs, and workflows.
  • Implement guardrails for AI systems including prompt safety, data protection, hallucination mitigation, access control, and output validation.
  • Work with multimodal AI models including text, image, and video use cases such as video analysis, summarization, and optimization.
  • Integrate AI solutions with existing enterprise systems such as Salesforce, ERP platforms, data lakes, APIs, and internal applications.
  • Partner with security and compliance teams to ensure responsible AI usage, data privacy, and governance.
  • Prototype quickly, then harden solutions for production with monitoring, logging, evaluation, and performance optimization.
  • Mentor and upskill existing developers on AI concepts, patterns, and best practices.
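The RAG duties above come down to a retrieve-then-prompt loop. The sketch below is a minimal, hypothetical illustration only: toy bag-of-words vectors stand in for a real embedding model, and the vector database and LLM call are omitted. The document snippets are invented.

```python
# Minimal sketch of the retrieval step in a RAG pipeline (hypothetical data;
# a production system would use an embedding model and a vector database).
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts stand in for a dense vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Dock leveler maintenance manual: inspect hydraulic seals quarterly.",
    "ERP ticket: customer reports door sensor fault after installation.",
    "HR handbook: vacation accrual policy for salaried employees.",
]
context = retrieve("door sensor fault troubleshooting", docs)
# Assemble the grounded prompt that would be sent to the LLM.
prompt = "Answer using only this context:\n" + "\n".join(context)
```

The most relevant snippet (the ERP ticket) is ranked first and placed into the prompt context, which is the core of the "RAG solutions using structured and unstructured enterprise data" duty.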

Required Skills & Experience

  • 5+ years of full-stack development experience.
  • Strong software engineering background with experience building production-grade applications.
  • Hands-on experience with modern LLM platforms such as OpenAI, Azure OpenAI, Anthropic, or similar.
  • Practical experience building RAG pipelines using vector databases and embedding models.
  • Experience with prompt engineering, prompt versioning, and evaluation techniques.
  • Solid Python experience for AI development.
  • Experience integrating AI services with REST APIs, microservices, and cloud-native architectures.
  • Familiarity with cloud platforms such as AWS or Azure, including deployment, scaling, and security concepts.
  • Understanding of data formats such as JSON, XML, and document-based data.
  • Ability to translate business problems into AI-driven technical solutions.

Preferred Qualifications

  • Experience with vector databases such as Pinecone, FAISS, Weaviate, or similar.
  • Familiarity with frameworks such as LangChain, LlamaIndex, Semantic Kernel, or equivalent orchestration tools.
  • Experience implementing AI safety controls, policy enforcement, and evaluation frameworks.
  • Exposure to video or image models and multimodal AI use cases.
  • Experience working in enterprise environments with security, compliance, and change management considerations.
  • Prior experience mentoring or leading developers in new technical domains.

What We Offer

At Rite-Hite, we take care of our people - because when you're supported, you can do your best work. Our benefits are designed to support your health, your future and your life outside of work:

  • Health & Well-being: Comprehensive medical, dental, and vision coverage, plus life and disability insurance. A robust well-being program with an opportunity to receive an extra day off and more.

  • Financial Security: A strong retirement savings program with 401(k), company match, and profit sharing.

  • Time for You: Paid holidays, vacation time, and personal/sick days each year.

Join us and build a career where you're supported - at work and beyond.

Rite-Hite is proud to be an Equal Opportunity Employer. We consider all qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other protected characteristic under federal, state, or local law. In accordance with VEVRAA, we are committed to providing equal employment opportunities for protected veterans. We are also committed to maintaining a drug-free workplace for the safety of our employees and customers.

Not Specified
Workday Integration Lead - EIB (Dallas, Texas; locals only)
✦ New
Salary not disclosed
Dallas 1 day ago
Job Title: Workday Integration Lead - EIB

Location: Dallas, Texas (locals only)

Contract: 1 year

Mandatory Skills: Workday, Hyperion, Boomi

Technical Skills

Strong experience in enterprise integration development.

Hands on expertise with Workday integrations (Studio, EIB, RaaS, APIs).

Experience integrating Payroll systems (PECI/DT), Finance/ERP, and third party vendors.

Knowledge of REST/SOAP APIs, JSON, XML, flat files, SFTP, and middleware concepts.

Familiarity with data transformation, validation, and reconciliation techniques.

Understanding of security, encryption, and audit logging in integrations.

Functional Knowledge

HR and Payroll data domains (worker, job, compensation, benefits, time, payroll results).

Finance integration concepts (GL, cost centers, headcount, budgeting, reporting).

Vendor integration lifecycle and dependency management.

Tools and Platforms (Preferred)

Workday Studio, EIB, RaaS.

Integration middleware (Boomi, MuleSoft, or similar; optional).

Snowflake / Data Warehouse integrations (preferred).

Monitoring and logging tools (Splunk or equivalent; preferred).

Thanks and Regards,
Mahesh Kumar, Team Lead
Direct No: 949-201-1313
Yochana Solutions INC
Windsor, Ontario, Canada | Farmington Hills, MI 48335, USA
USA | Canada | Mexico | India
W:

Note: This is not an unsolicited mail. If you are not interested in receiving our e-mails, please reply with the subject line "Remove".

Keywords: Workday, Boomi, Hyperion
Not Specified
Software Engineer in Test
Salary not disclosed
Mount Laurel, NJ 2 days ago

Job Title: SDET / QA Automation Engineer

Location: Mount Laurel, NJ

Duration: Long Term


Job Description:

Job Summary:

We are seeking a highly skilled and experienced SDET / QA Automation Engineer with 8 to 10 years of expertise in Python, JavaScript, and modern automation frameworks. This position involves developing automation solutions, microservices, and test scripts while validating end‑to‑end network components and their behavior. The candidate should have strong domain knowledge in networking and cable technologies, with the ability to collaborate effectively with clients and cross‑functional teams.


Key Responsibilities:

  • Develop microservices using Python, NodeJS, and Golang as part of automation and service validations.
  • Develop standalone Python/NodeJS scripts to simulate network traffic and validate performance across different endpoints.
  • Create Proof of Concepts (POCs) based on client needs and actively participate in client demos and technical discussions.
  • Lead the creation of test strategies and manage test environments with both physical and virtual device setups.
  • Create comprehensive test scenarios and automated test scripts using MochaJS, ensuring robust coverage of functional, integration, and regression test cases.
  • Design reusable test components, validate API and microservice behavior, and integrate MochaJS test suites into the existing automation framework to enhance reliability and execution efficiency.
  • Collaborate with cross‑functional teams to refine requirements, improve test coverage, and ensure smooth integration with CI/CD pipelines.
  • Gather requirements and perform detailed analysis for new automation scenarios and test case development.
  • Support manual and automation testing across applications, devices, and servers as required.
  • Ensure code quality using tools like SonarQube and adhere to strict QA standards.
  • Provide technical guidance, troubleshooting support, and mentorship to team members on tasks and issues raised by the client.
  • Maintain version control and branching strategies using GitHub, ensuring high code integrity and traceability.
  • Monitor automation execution, analyze failures, and drive root‑cause investigations to improve overall product quality.
  • Document technical workflows, automation processes, and test scenarios to ensure long-term maintainability and knowledge sharing.
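The "simulate network traffic and validate performance" duty above can be sketched as a small standalone script. This is an illustration under stated assumptions: the latency samples are simulated with a fixed random seed, and the p95 budget and endpoint names are invented.

```python
# Hypothetical performance-validation sketch: simulated latency samples
# stand in for real measured network traffic.
import random

random.seed(7)  # deterministic for repeatable runs

# Simulated round-trip latencies (ms) for two hypothetical endpoints.
samples = {
    ep: [random.gauss(mu, 5) for _ in range(100)]
    for ep, mu in {"edge-a": 20, "edge-b": 45}.items()
}

def validate(latencies: list[float], p95_budget_ms: float) -> bool:
    # Pass if the approximate 95th-percentile latency stays within budget.
    ordered = sorted(latencies)
    p95 = ordered[int(0.95 * len(ordered)) - 1]
    return p95 <= p95_budget_ms

for endpoint, lat in samples.items():
    ok = validate(lat, p95_budget_ms=60)
    print(endpoint, "PASS" if ok else "FAIL")
```

In a real harness the sample lists would be filled by timing actual requests against each endpoint, with the same percentile check applied afterward.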


Required Skills & Experience:

  • 8-10 years of experience in QA/SDET automation roles.
  • Strong programming knowledge with Python and JavaScript.
  • Good hands-on experience with Golang and NodeJS.
  • Hands-on experience with MochaJS for scripting and automated testing.
  • Excellent knowledge of web technologies such as REST, SOAP, XML, and JSON.
  • Proficiency in API testing using Bruno/Postman.
  • Familiarity with GitHub for version control and Jira for project tracking.
  • Excellent domain knowledge in the networking and cable domain.
  • Should be familiar with IMS architecture and SIP protocols.
  • Good problem-solving and debugging skills.

Should have good communication and client interaction skills.

Not Specified
Vermilion Specialist
✦ New
Salary not disclosed
Newark, NJ 1 day ago

Role Description


Platform Configuration: Implement, configure, and customize the Vermilion Reporting Suite (VRS) to meet specific client investment reporting needs.


Technical Integration: Design and manage data interfaces between VRS and external sources (e.g., SQL Server, Oracle, Markit EDM, Aladdin, API/XML/JSON).


Workflow Automation: Build, test, and maintain automated reporting workflows and batch processing.


Report Development: Design high-quality, branded reports and templates (PDF, MS Office, HTML) for various asset classes.


Testing & Troubleshooting: Conduct system testing, data validation, and troubleshooting to ensure accuracy and platform performance.


Required Technical & Professional Skills


Core Technical Skills: Strong SQL (MS SQL/Oracle), ETL processes, and database debugging skills.

VRS Experience: Deep knowledge of Vermilion modules, APIs, and report design.


Financial Knowledge: Understanding of performance measurement, client reporting, and portfolio data (Fixed Income, Equity, Derivatives).

Not Specified
Interoperability Specialist
Salary not disclosed
Little Rock, AR 2 days ago

Job Title: Interoperability Specialist

Location: Little Rock, AR / Remote


Company Overview

At AngelEye Health, our mission is to empower families to improve the clinical outcomes of their loved ones. We provide a HIPAA-compliant family engagement platform that integrates parents into the child’s care team. From our bedside camera systems to our feeding and discharge management tools, we aim to reduce family stress and improve the patient journey from admission to home.


Position Summary

We are looking for a skilled Interoperability Specialist to join our technical team. In this role, you will be the bridge between AngelEye’s platform and the complex Electronic Health Record (EHR) environments of our hospital partners. You will design, develop, and maintain critical data interfaces using Mirth Connect and APIs, ensuring that vital patient information flows seamlessly to support our family-centered care solutions.


Key Responsibilities

  • Interface Development: Build, test, and deploy HL7 interfaces using Mirth Connect to facilitate data exchange between AngelEye and hospital EHR systems.
  • Custom Scripting: Use JavaScript within Mirth to handle complex message transformations, data mapping, and custom logic for non-standard clinical data.
  • EHR Integration: Act as the technical lead for integrations with Epic (Bridges), Cerner (Open Engine), and other major EHR providers.
  • Clinical Data Workflows: Configure and troubleshoot specific message types, including:
  • ADT: Admission, Discharge, and Transfer events.
  • Orders: Ensuring clinical orders trigger appropriate platform actions.
  • Flowsheets: Capturing data and clinical observations for automated documentation.
  • Technical Support: Monitor existing channels, perform root-cause analysis on interface failures, and maintain interface systems.
  • Stakeholder Collaboration: Work closely with hospital IT teams, project managers, and clinical staff to ensure technical builds align with clinical workflows.
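The ADT workflows above revolve around splitting HL7 v2 messages into segments and fields. Mirth transformers are written in JavaScript, but the field-splitting logic can be illustrated in Python; the message below is a hypothetical, abbreviated ADT^A01 example, not real patient data.

```python
# Hypothetical HL7 v2 ADT^A01 (admission) message. Segments are separated
# by \r, fields by |, and field components by ^.
hl7 = (
    "MSH|^~\\&|EHR|HOSP|ANGELEYE|NICU|202401010830||ADT^A01|MSG0001|P|2.5\r"
    "PID|1||12345^^^HOSP^MR||DOE^JANE||20231215|F\r"
    "PV1|1|I|NICU^101^A"
)

def parse_segments(message: str) -> dict:
    # Index each segment's field list by its segment ID (MSH, PID, PV1, ...).
    segments = {}
    for seg in message.split("\r"):
        fields = seg.split("|")
        segments[fields[0]] = fields
    return segments

segs = parse_segments(hl7)
event = segs["MSH"][8]                         # message type, e.g. "ADT^A01"
family, given = segs["PID"][5].split("^")[:2]  # PID-5 name components
print(event, family, given)  # → ADT^A01 DOE JANE
```

A Mirth channel would apply the same extraction in a JavaScript transformer, then route the admission event on to the downstream platform action.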


Technical Qualifications

  • Integration Engine: Minimum 3–5 years of hands-on experience with Mirth Connect (NextGen Connect).
  • Coding: Proficiency in JavaScript for writing custom transformers, filters, and global scripts.
  • Standards: Expert knowledge of HL7 v2.x; familiar with FHIR, JSON, and REST APIs.
  • EHR Ecosystems: Direct experience working with Epic and Cerner integration environments.
  • Database Skills: Ability to write SQL queries to validate data and troubleshoot backend issues.


Core Competencies (Personal Skills)

  • Communication: Ability to translate complex technical jargon into clear language for non-technical hospital staff and internal leadership.
  • Problem-Solving: A "PhD-level execution" mindset—you don’t just fix the symptom; you solve the root cause.
  • Urgency & Ownership: A strong sense of urgency in resolving issues that impact family-patient connectivity.
  • Empathy: A deep alignment with our mission to support families during their most challenging times in the NICU/PICU.
Not Specified
Software Engineer
✦ New
Salary not disclosed
Redmond, WA 11 hours ago

Job Title: Software Engineer II

Location: Redmond, WA

Contract: 12 months

Pay Rate: $87.29/hr, W2

Benefits: Medical, Dental, Vision and Weekly Pay


Job Summary (2-4 Years):

We are seeking a highly skilled and motivated Software Engineer to join our specialized engineering team. This role is centered on the development of sophisticated software for advanced hardware control and lab automation, with a primary focus on aero-acoustic wind tunnel systems. In this role, you will use Python to design, build, and enhance control mechanisms for both a classic recirculating wind tunnel and a Client modular fan-array wind tunnel. This position offers a unique and exciting opportunity to work at the intersection of software development, robotics, acoustics, and aerodynamics.


Key Responsibilities:

  • Design, develop, and maintain high-quality, reusable, and reliable Python code for controlling complex hardware systems, including wind tunnel fan arrays and associated mechanical components.
  • Implement and optimize control algorithms for real-time performance and precision, including PWM control for fan motors and actuators.
  • Develop software to program and generate a variety of airflow conditions, including laminar, turbulent, gradient, and time-varying flows.
  • Create and manage control interfaces for secondary systems, including robotic HATS (Head and Torso Simulator) movers, lift systems, and multi-channel spatial audio (HOA) setups.
  • Integrate data from motion capture systems (e.g., Optitrack) for real-time tracking and system alignment.
  • Develop and execute automated procedures for the calibration of wind speeds, acoustic sensors, and other critical measurements.
  • Build and maintain data pipelines for capturing, processing, and analyzing experimental data from a wide range of sensors.
  • Troubleshoot and resolve complex software and system-level issues to ensure high availability and reliability of all lab equipment.
  • Produce clear and comprehensive documentation for software architecture, APIs, and operational procedures.


Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Mechanical Engineering, Robotics, or a related field.
  • Proven professional experience in Python programming with a strong emphasis on hardware control, lab automation, or robotics.
  • Solid understanding of control systems theory, digital signal processing, and data acquisition principles.
  • Strong foundational knowledge of fluid mechanics, aerodynamic principles, and acoustic measurement techniques.
  • Experience working in a laboratory or R&D environment is highly desirable.
  • Demonstrated ability to debug complex, multi-component systems that include both hardware and software.
  • Excellent communication and collaboration skills, with an ability to work effectively in a multidisciplinary team.


Technical Skills

Required:

  • Proficiency in modern Python (3.8+) and object-oriented design.
  • Experience with scientific computing and data analysis libraries (NumPy, SciPy, Pandas).
  • Experience with hardware control interfaces and protocols (e.g., PWM, serial, Ethernet).
  • Experience with libraries for audio signal processing or multi-channel data acquisition (e.g., sounddevice, librosa).
  • Competency with version control systems, particularly Git.
  • Experience using configuration file formats like YAML or JSON.
  • Knowledge of network communication protocols and experience with REST APIs.
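The configuration-file and PWM skills in the list above can be sketched together. Everything here is an assumption for illustration: the config schema, the linear rpm-per-airflow calibration, and the fan-array dimensions are invented, and a real controller would drive actual hardware rather than print a number.

```python
# Hypothetical fan-array configuration; schema and control law are invented
# for illustration (JSON used since it is in the required skills list).
import json

config_text = """
{
  "fan_array": {"rows": 2, "cols": 3, "max_rpm": 3000},
  "target_airflow_ms": 7.5,
  "calibration": {"rpm_per_ms": 320.0}
}
"""

cfg = json.loads(config_text)

# Assumed linear calibration: target RPM proportional to target airflow (m/s).
target_rpm = cfg["target_airflow_ms"] * cfg["calibration"]["rpm_per_ms"]

# Clamp to hardware limits and express as a PWM duty cycle in [0, 1].
max_rpm = cfg["fan_array"]["max_rpm"]
duty = min(target_rpm, max_rpm) / max_rpm

# Uniform duty across the array; a gradient flow would vary this per fan.
duties = [[duty] * cfg["fan_array"]["cols"]
          for _ in range(cfg["fan_array"]["rows"])]
print(round(duty, 2))  # → 0.8
```

Per-fan duty matrices like `duties` are also the natural place to program the laminar, turbulent, gradient, and time-varying flow profiles mentioned in the responsibilities.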


Preferred:

  • Experience with motion control systems for robotics or automation.
  • Familiarity with spatial audio technologies, particularly High-Order Ambisonics (HOA).
  • Experience integrating motion capture systems (e.g., Optitrack) into control software.
  • Experience with advanced aerodynamic measurement techniques such as Particle Image Velocimetry (PIV).
  • Familiarity with GUI development frameworks for creating internal tools.
  • Experience with cross-platform software development (Windows, macOS, Linux).


Pursuant to the California Fair Chance Act, Los Angeles County Fair Chance Ordinance for Employers, Los Angeles Fair Chance Initiative for Hiring Ordinance, and San Francisco Fair Chance Ordinance, qualified applicants will be considered for assignment with arrest and conviction records. Criminal history may have a direct, adverse, and negative relationship with some of the material job duties of this position. These include the duties and responsibilities listed above, as well as the abilities to adhere to company policies, exercise sound judgment, effectively manage stress and work safely and respectfully with others, exhibit trustworthiness, meet client expectations, standards, and accompanying requirements, and safeguard business operations and company reputation.

#TMN

Not Specified
Data Engineer
✦ New
Salary not disclosed

Data Engineer

Our client is seeking a Data Engineer to take ownership of end-to-end data processes within a growing, values-driven organization. This individual will play a key role in ensuring data is accurate, reliable, and actionable across the business. The ideal candidate is hands-on, detail-oriented, and comfortable working across the full data lifecycle—from ingestion and transformation to reporting and stakeholder enablement.


This role is a hybrid model in Portland, Oregon or Lakewood, Washington.


Data Engineer Responsibilities

  • Own data quality across systems by identifying, troubleshooting, and resolving inconsistencies and inaccuracies.
  • Design, build, and maintain scalable ETL/ELT pipelines to transform raw data into clean, structured datasets.
  • Manage data ingestion processes from multiple internal and external sources, including APIs and databases.
  • Develop and optimize SQL queries, data models, and schemas to support analytics and reporting needs.
  • Create and maintain dashboards and reports in Power BI, ensuring they are accurate, user-friendly, and actionable.
  • Partner with business stakeholders to translate requirements into data solutions and meaningful insights.
  • Monitor pipeline performance and reliability, proactively addressing failures and inefficiencies.
  • Contribute to data architecture design, including data lake structure and best practices.
  • Document data sources, transformations, and workflows to support transparency and scalability.
  • Collaborate cross-functionally with engineering and business teams to support data-driven decision making.
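The ingestion-and-transform responsibilities above often mean flattening semi-structured JSON from an API into tabular rows. The sketch below is a hypothetical transform step: the field names and payload are invented, and a production pipeline would load the rows into a warehouse table rather than print them.

```python
# Hypothetical ETL transform: flatten nested JSON (as might come from an
# API) into one row per order line item.
import json

raw = json.loads("""
[
  {"order_id": 1, "customer": {"name": "Acme", "region": "West"},
   "items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]},
  {"order_id": 2, "customer": {"name": "Globex", "region": "East"},
   "items": [{"sku": "A1", "qty": 5}]}
]
""")

# Denormalize: repeat order/customer columns for each line item.
rows = [
    {
        "order_id": order["order_id"],
        "customer_name": order["customer"]["name"],
        "region": order["customer"]["region"],
        "sku": item["sku"],
        "qty": item["qty"],
    }
    for order in raw
    for item in order["items"]
]
print(len(rows), rows[0]["sku"])  # → 3 A1
```

The resulting flat rows are what a downstream SQL model or Power BI dataset would consume.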


Data Engineer Qualifications

  • 3+ years of experience in a data engineering, analytics engineering, or similar role with ownership of data pipelines and reporting.
  • Strong proficiency in SQL, including complex queries, joins, and performance optimization.
  • Hands-on experience with Python for data transformation, scripting, and automation.
  • Proven experience building and maintaining Power BI dashboards, including data modeling and DAX.
  • Experience designing and managing ETL/ELT processes and understanding when to apply each approach.
  • Familiarity with cloud-based data platforms, preferably within a Microsoft ecosystem (e.g., Azure Data Factory, Synapse, or similar tools).
  • Experience working with data lakes and modern data architecture concepts.
  • Ability to work with APIs and semi-structured data formats such as JSON.
  • Strong communication skills with the ability to explain data concepts to non-technical stakeholders.
  • Detail-oriented with a strong sense of ownership and accountability for data accuracy.


Preferred:

  • Experience with ERP or CRM systems as data sources (e.g., Microsoft Dynamics environments).
  • Familiarity with transformation frameworks such as dbt.
  • Experience working in smaller, collaborative teams with broad responsibilities.
  • Background supporting financial or operational data where accuracy is critical.
Not Specified
Sr. Software Developer (MACESS)
Salary not disclosed
Oakland, CA 2 days ago

Job Role : Sr. Software Developer (MACESS)

Location: Oakland, CA - Onsite

Duration: 12+ Month Contract

Only W2


Job Description

We are seeking a highly skilled Senior Application Developer to lead the design, development, and maintenance of enterprise workflow systems, with a specific focus on the FIS MACESS platform. In this role, the SME will be responsible for optimizing complex document management and workflow processes, integrating MACESS with internal and external systems, and mentoring junior developers. You will act as the technical subject matter expert to ensure business processes are automated, scalable, and secure.


Technical Qualifications:

  • MACESS Expertise: Experience with FIS MACESS (design, workflow, imaging, and system administration) is desirable.
  • Programming: Proficiency in VBScript, C#/.NET, or Java, with experience building custom extensions or utilities for MACESS
  • Database: Advanced knowledge of SQL Server, ability to manage large-scale data sets and complex schemas.
  • Web Services: Experience with SOAP/RESTful APIs and XML/JSON data structures
  • Scripting: Strong PowerShell, VBScript, or Python scripting skills for task automation
  • Methodology: Proven experience working in an Agile/Scrum environment with a focus on CI/CD pipelines

Education & Experience:

  • Bachelor’s degree in computer science, Information Technology, or a related field
  • 4+ years of total software development experience
  • Experience in the Healthcare or Insurance industry


Thanks,

Rahul Gupta

Direct : (732) 743-7543

Email:

Not Specified
Twilio Segment CDP Architect
✦ New
Salary not disclosed
Bethlehem, PA 1 day ago

Segment CDP Platform Consultant Architect (Contract)


KEY POINTS

• Lead architecture & governance of a Twilio Segment CDP platform

• Bethlehem location

• Senior contract role owning data models, identity + CDP operations


ABOUT THE CLIENT

We’re supporting a large organisation investing in customer data infrastructure and modern MarTech.

They are building a Twilio Segment-powered CDP to improve data governance, audience activation, and real-time customer insight.

Due to continued growth, they’re looking for a Segment CDP Architect to take ownership of the platform’s design, governance, and day-to-day operations.


THE ROLE

Senior, hands-on role owning the Segment CDP architecture and operations.

You’ll define the data model, identity strategy, and tracking standards, while ensuring the platform is scalable, compliant, and reliable.


Key responsibilities:

• Own CDP data model and identity resolution (Profiles / Unify)

• Define event schema, naming conventions, and tracking standards

• Configure and govern sources, destinations, and integrations

• Implement Protocols, tracking plans, and schema validation

• Build and optimise Engage audiences and traits

• Manage data quality, SLAs, and platform performance

• Ensure consent, privacy, and governance standards are met
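The "Protocols, tracking plans, and schema validation" responsibility above amounts to checking each event against a declared plan. Segment Protocols enforces this inside the platform; the sketch below only illustrates the idea, and the event name, property names, and plan shape are all hypothetical.

```python
# Hypothetical tracking plan: required/optional properties per event name.
tracking_plan = {
    "Order Completed": {
        "required": {"order_id": str, "revenue": float},
        "optional": {"coupon": str},
    }
}

def validate(event: dict) -> list[str]:
    # Return a list of violations; an empty list means the event conforms.
    spec = tracking_plan.get(event.get("event"))
    if spec is None:
        return [f"unplanned event: {event.get('event')}"]
    props = event.get("properties", {})
    errors = [f"missing: {k}" for k in spec["required"] if k not in props]
    allowed = {**spec["required"], **spec["optional"]}
    for k, v in props.items():
        if k not in allowed:
            errors.append(f"unexpected property: {k}")
        elif not isinstance(v, allowed[k]):
            errors.append(f"bad type for {k}")
    return errors

event = {"event": "Order Completed",
         "properties": {"order_id": "A-100", "revenue": 49.99}}
print(validate(event))  # → []
```

In Segment itself the equivalent checks come from attaching a tracking plan to a source, with violations surfaced (or blocked) by Protocols rather than application code.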


ESSENTIAL SKILLS

• 10+ years in MarTech / CDP / marketing data

• 3+ years hands-on Twilio Segment

• Strong data modelling + identity resolution experience

• Technical capability with JSON, SQL, REST APIs

• Experience with Segment destinations + server-side tracking

• Proven data quality + platform optimisation experience

• Strong understanding of privacy and consent frameworks


NICE TO HAVE

• Warehouse-native CDP / Reverse ETL

• CMP / privacy tooling

• CI/CD for tracking plans

• Experience in regulated environments


TO BE CONSIDERED

Please either apply by clicking online or email me directly at . I can make myself available outside of normal working hours between 7am – 8pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process and submit (subject to required skills) your application to our client in conjunction with this vacancy only.


KEY SKILLS

Twilio Segment / CDP / Customer Data Platforms / MarTech Architecture / Data Modelling / Identity Resolution / Segment Protocols / Audience Activation / Privacy & Consent / Data Governance / Marketing Technology

Not Specified
SAP BTP CPI Consultant
✦ New
Salary not disclosed
Dallas, TX 11 hours ago

Roles & Responsibilities

Responsibilities: We are looking for a motivated SAP BTP - IS/CPI Consultant with a passion for the manufacturing industry. As an SAP BTP - CPI Consultant, you will assist with various projects related to SAP BTP - CPI implementation tasks. You will work closely with a team of experienced professionals to implement and support SAP BTP - CPI solutions.


Key Responsibilities:

Implementation: Configuring, implementing, and maintaining the SAP BTP-CPI module

Experience in developing inbound and outbound interfaces to/from the cloud to On-Premise and Cloud instances of SAP Products

Experience in configuration and extension of standard iFlows.

Experience in defining custom iFlows, local & exception sub-processes, exception handling

Expertise in handling various integration adapters in SAP CPI (SFSF, ODATA, SFTP, IDOC, SOAP, HTTP, Process Direct, REST Adapters) to exchange messages between sender and receiver

Knowledge of developing value mappings and Groovy scripts and using them in iFlows

Experience in handling different data conversions like JSON to XML, CSV to XML, etc.
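CPI performs conversions like JSON to XML with its built-in converter steps or Groovy scripts; the mapping itself can be illustrated in Python with the standard library. The payload below is invented, and the element-naming convention (list entries wrapped as `<item>`) is an assumption of this sketch.

```python
# Illustrative JSON-to-XML mapping (hypothetical payload); CPI would do
# this with a JSON-to-XML converter step or a Groovy script in an iFlow.
import json
import xml.etree.ElementTree as ET

def json_to_xml(obj, tag: str) -> ET.Element:
    # Dicts become nested elements, lists become repeated <item> elements,
    # and scalars become element text.
    el = ET.Element(tag)
    if isinstance(obj, dict):
        for k, v in obj.items():
            el.append(json_to_xml(v, k))
    elif isinstance(obj, list):
        for item in obj:
            el.append(json_to_xml(item, "item"))
    else:
        el.text = str(obj)
    return el

payload = json.loads('{"order": {"id": 42, "lines": [{"sku": "A1"}]}}')
root = json_to_xml(payload["order"], "order")
print(ET.tostring(root, encoding="unicode"))
# → <order><id>42</id><lines><item><sku>A1</sku></item></lines></order>
```

The reverse direction (XML to JSON) follows the same recursive walk over elements instead of dict keys.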

Experience in using various CPI palette options (integration patterns – message transformations, enricher, splitter, etc.) and experience in SAP BTP. Experience in handling security artifacts, encryption and decryption mechanisms, and SSH keys

Experience with EDI integrations, API management.

Design interfaces and integration flows, and develop solutions to meet business needs

Experience in SAP Cloud Platform Integration, SAP HANA Cloud Integration, and SAP Process Orchestration

Knowledge of SAP Cloud Connector and CPI cockpit

Provide technical support and troubleshooting for applications developed using SAP CPI

Testing: Creating test data, running tests, and creating and executing test scripts

Continuous improvement: Monitoring system performance, identifying opportunities for improvement, and recommending enhancements

Not Specified
jobs by JobLookup