Operations Technician I
Location: Rahway, NJ Work Environment: On-Site
Job Summary
The Operations Technician I is an entry-level professional responsible for executing technical tasks and supporting critical operational activities. This role is essential to the timely packaging, labeling, warehousing, and distribution of bulk and finished clinical materials.
The successful candidate will ensure full compliance with Quality-related aspects of Global Clinical Supply Operations, including inventory management, SOP authoring, and supporting internal inspections.
Key Responsibilities
Clinical Supply & Logistics
- Perform primary and secondary packaging of drug products, biologics, and vaccines.
- Execute distribution activities, including order processing and drug picking, packing, and shipping.
- Manage all tasks related to clinical label printing and production.
- Oversee warehousing activities, including bulk component inventory movement and accuracy.
- Maintain optimal inventory levels for consumables through proactive ordering.
Compliance & Quality Assurance
- Maintain cGMP and safety training to site requirements at all times.
- Author or revise Standard Operating Procedures (SOPs) and support batch record reconciliation.
- Support investigations into root causes and assist in the creation of Corrective and Preventative Actions (CAPAs).
- Act as a Subject Matter Expert (SME) during internal and external audits by regulatory agencies and safety bodies.
Technical Operations
- Operate within cold vaults and within walk-in and stand-up controlled temperature units (CTUs).
- Execute assigned technical tasks with a high degree of reliability and follow-through.
- Utilize SAP for technical activities and maintain accurate, detailed documentation.
- Engage in creative problem-solving and provide analysis to anticipate technical obstacles.
Qualifications & Skills
Education & Experience
- Bachelor’s degree preferred; candidates with relevant experience will be considered.
- 0–3 years of experience in a related field (or 3+ years of relevant experience for candidates without a degree).
- Experience using and wearing respiratory equipment.
- Training or experience in forklift and pallet jack operation.
Technical Knowledge
- Strong understanding of Good Manufacturing Practice (GMP) principles.
- Knowledge of quality and safety requirements for pharmaceutical packaging and handling.
- SAP experience is highly desired.
- Equipment operation and repair skills.
Physical Requirements
- Ability to repetitively lift, carry, push, and pull up to 50 lbs.
General Competencies
- Exceptional organizational skills and meticulous attention to detail.
- Strong problem-solving and troubleshooting abilities.
- Ability to quickly learn new systems and demonstrate in-depth knowledge of GMP processes.
Marketplace Data Product Engineer
Duration: 6+ months
Location: 100% Remote
Job Overview
The Marketplace Data Product Engineer serves as the primary technical facilitator and adoption champion for the Marketplace platform. This role bridges engineering, product, and business domains, leading workshops, demos, onboarding sessions, and cross-domain engagements to accelerate Marketplace adoption. You will configure demo environments, support development, translate complex technical concepts for business audiences, gather product feedback, and partner closely with product and engineering teams to shape the Marketplace roadmap. You will also guide domains through the process of understanding, showcasing, and maturing their data products within the ecosystem.
Key Responsibilities
- Facilitate workshops, demos, onboarding sessions, and cross-domain engagements to drive Marketplace adoption.
- Serve as the primary technical presenter of the Marketplace for domain teams and stakeholders.
- Engage with domain owners to understand their data products, help refine their articulation, and showcase how they integrate into the Marketplace ecosystem.
- Configure and maintain demo environments for Marketplace capabilities, data products, and new features.
- Support light development, proof-of-concept configurations, and sample integrations to demonstrate platform capabilities.
- Translate technical Marketplace concepts into clear, business-friendly language for non-technical audiences.
- Collect structured feedback from domain teams, synthesize insights, and partner with product and engineering to influence the roadmap.
- Develop and refine training materials, demos, playbooks, and onboarding assets to support continuous adoption.
- Act as an advocate for domains, ensuring their data product needs and challenges are well represented in Marketplace planning.
- Support ongoing adoption initiatives, including community sessions, office hours, and cross-domain knowledge sharing.
Required Skills & Qualifications
- 4-7+ years of experience in data engineering, platform engineering, solution engineering, technical consulting, or similar roles.
- Strong understanding of data products, data modeling concepts, data APIs, enterprise integrations, and metadata-driven architectures.
- Ability to configure and demonstrate platform features, build light proofs-of-concept, and support technical onboarding.
- Excellent communication and presentation skills, with experience translating technical concepts for business partners.
- Experience facilitating workshops, leading demos, or driving customer/product adoption initiatives.
- Ability to engage domain teams, understand their data product needs, and help articulate value within a larger ecosystem.
- Strong collaboration and stakeholder management skills across engineering, product, and business teams.
- Comfortable working in fast-moving environments and driving clarity through ambiguity.
Preferred Qualifications
- Experience with data product and governance frameworks, data marketplaces, data mesh concepts, or platform adoption roles.
- Hands-on experience with cloud data platforms (Azure, AWS, or GCP), data pipelines, or integration tooling.
- Familiarity with REST/GraphQL APIs, event-driven patterns, and data ingestion workflows.
- Background in solution architecture, customer engineering, or sales engineering.
- Experience developing demo environments, sample apps, or repeatable platform enablement assets.
- Strong storytelling ability when explaining data product value, domain capabilities, and Marketplace patterns.
Data Scientist
Duration: 12 Months (Temp to Hire)
Location: Newark, NJ 07102
Job Description:
Are you interested in building capabilities that enable the organization with innovation, speed, agility, scalability and efficiency? When you join our organization at Prudential, you'll unlock an exciting and impactful career - all while growing your skills and advancing your profession at one of the world's leading financial services institutions.
As a Data Scientist on the US Businesses PruAdvisors Data Science Team, you will partner with Machine Learning Engineers, Data Engineers, Business Leaders and other professionals to build GenAI and ML models to improve advisor experience, perform lead scoring, and increase sales revenue. You will implement AI and machine learning models that will deliver stability, scalability and integration with other advisor products and services. You will implement capabilities to solve sophisticated business problems and deploy innovative products, services and experiences to delight our customers! In addition to deep technical expertise and experience, you will bring excellent problem solving, communication and teamwork skills, along with agile ways of working, strong business insight, an inclusive leadership attitude and a continuous learning focus to all that you do.
Responsibilities:
- Provide deep technical leadership to a portfolio of high impact data science initiatives involving sales and advisor experience. Identify the optimal sets of data, models, training, and testing techniques required for successful product delivery. Remove complex technical impediments
- Leverage your experience and skills to identify new opportunities where data science and AI can improve experiences, gain efficiencies, and generate sales.
- Manage team members in AI/ML and model development, testing, training, and tuning. Apply hands-on experience to ensuring best-in-class model development. Mentor team members in technical skill development and product ownership.
- Communicate clearly and concisely, in writing and verbally, all facets of model design and development. Continuously look for insights in models developed and generate new ideas for model improvement.
- Manage external vendors in the execution of parts of the data science development process as needed.
- Leverage continuous integration and continuous deployment best practices, including test automation and monitoring, to ensure successful deployment of ML models and application code on Prudential's AI/ML platform.
- Bring a deep understanding of relevant and emerging technologies, give technical direction to team members and embed learning and innovation in the day-to-day.
- Work on significant and unique issues where analysis of situations or data requires an evaluation of intangible variables and may impact future concepts, products or technologies.
- Familiarity with Python, SQL, AWS, and JIRA.
- Familiarity with LLMs, deployment of LLMs, RAG, LangChain, LangGraph, and Agentic AI concepts.
The Skills and expertise you bring:
- A degree in Applied Statistics, Computer Science, or Engineering, or experience in related fields, with a focus on machine learning, AI, and LLMs.
- Junior-level industry experience with responsibility for developing and delivering advanced quantitative, AI/ML, analytical, and statistical solutions.
- Ability to lead a small team with minimal guidance and effectively leverage diverse ideas, experiences, thoughts and perspectives to the benefit of the organization to deliver AI products.
- Ability to influence business stakeholders and to drive adoption of AI/ML solutions.
- Experience with agile development methodologies, Test-Driven Development (TDD), and product management.
- Knowledge of business concepts, tools, and processes that are needed for making sound decisions in the context of the company's business.
- Demonstrated ability to mentor and operationally manage a data science team based on project requirements, resourcing requirements, and planning dependencies as appropriate; anticipates risks and bottlenecks and proactively takes action.
- Excellent problem solving, communication and collaboration skills, and stakeholder management
- Significant experience and/or deep expertise with several of the following:
- Machine Learning and AI: Understanding of machine learning theory, including the mathematics underlying machine learning algorithms. Expertise in the application of machine learning theory to building, training, testing, interpreting and monitoring machine learning models. Expertise in traditional machine learning models (unsupervised, XGBoost, etc.) and Large Language Models (OpenAI, Claude).
- Model Deployment: Understanding of model development life cycle, CI/CD/CT pipelines (using tools like Jenkins, CloudBees, Harness, etc.), A/B testing, and pipeline frameworks such as AWS SageMaker, and newer AWS/Azure Agentic AI infrastructure products.
- Data Acquisition and Transformation: Acquiring data from disparate data sources using APIs and SQL; transforming data using SQL and Python; visualizing data using a diverse tool set including but not limited to Python.
- Database Management Systems: Knowledge of how databases are structured and function in order to use them efficiently. May include multiple data environments, cloud/AWS, primary and foreign key relationships, table design, database schemas, etc.
- Data Analysis and Insights: Analyzing structured and unstructured data using data visualization, manipulation, and statistical methods to identify patterns, anomalies, relationships, and trends.
- Programming Languages: Python and SQL
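By way of illustration (not part of the posting), the lead-scoring work this role describes can be sketched in plain Python. The feature names, weights, and bias below are entirely hypothetical; in practice the weights would be learned by a model such as XGBoost or logistic regression.

```python
import math

# Hypothetical weights a trained model might produce for advisor-lead features.
WEIGHTS = {"recent_contact": 1.2, "assets_under_mgmt": 0.8, "prior_purchase": 1.5}
BIAS = -2.0

def score_lead(features: dict) -> float:
    """Return a probability-like score in (0, 1) for a prospective lead."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic link maps z to (0, 1)

# A lead with strong signals should score higher than a cold lead.
hot = score_lead({"recent_contact": 1, "assets_under_mgmt": 1, "prior_purchase": 1})
cold = score_lead({})
```

Scores like these would typically feed a ranking of leads for advisors rather than being used as hard thresholds.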
Location: Remote
Duration: 8+ months
Marketplace Platform Lead
Job Overview
The Marketplace Platform Lead is responsible for driving the end-to-end technical architecture and implementation of the enterprise Data Marketplace platform. This role spans stakeholder engagement, architectural definition, integration design, and hands-on leadership throughout implementation. The ideal candidate is a seasoned technical leader with deep experience designing integration patterns, building scalable platforms, and guiding engineering teams through complex cross-system solutions.
Key Responsibilities
Lead stakeholder meetings to gather business requirements, align on platform objectives, and clarify workflows and user journeys.
Conduct tool evaluations, build scoring frameworks, and make recommendations on platforms, vendors, and integration technologies.
Define end-to-end Marketplace architecture, including data flows, APIs, domain models, integration strategies, and platform components.
Design and lead the implementation of integration patterns, including API-based integrations, event-driven patterns, workflow orchestration, and cross-system interoperability.
Develop technical designs, architectural documents, and standards for Marketplace workflows, user flows, and extensibility patterns.
Provide hands-on architectural guidance to engineering teams throughout solution design, development, and delivery.
Oversee technical quality, scalability, performance, and security across Marketplace components and integrations.
Collaborate with product, engineering, data, and security teams to ensure compliance with enterprise data governance, privacy, and reliability standards.
Lead technical reviews, drive design decisions, and ensure alignment across cross-functional stakeholders.
Required Skills & Qualifications
8+ years of experience in software engineering, platform development, or technical architecture roles.
Strong expertise in designing and implementing integration architectures, including REST/GraphQL APIs, event-driven patterns, synchronous/asynchronous messaging, and workflow engines.
Deep understanding of distributed systems, microservices, and cloud-native solutions (Azure, AWS, or GCP).
Proficiency with API design, messaging systems, and enterprise integration frameworks.
Experience defining technical architecture, data flows, and workflow designs for complex platforms.
Ability to translate business requirements into technical designs, user flows, and actionable engineering plans.
Demonstrated leadership in guiding engineering teams through architectural decisions and implementation.
Strong communication skills with the ability to influence technical and non-technical partners.
Experience evaluating and scoring platforms, tools, or vendor solutions.
Solid knowledge of DevOps practices, CI/CD, infrastructure-as-code, observability, and security best practices.
Preferred Qualifications
Experience building or leading a Data Marketplace platform.
Familiarity with workflow orchestration platforms, rules engines, BPM tools, or catalog management systems.
Experience with enterprise identity systems (OAuth, SAML, SSO), access governance, and data privacy frameworks.
Background working with enterprise data platforms, data governance, or cross-domain integration patterns.
Prior experience leading architectural governance or serving as a platform architect in an enterprise environment.
Solution Architect (Microsoft Purview)
Duration: 10+ Months
Location: Remote
Overview
We are seeking an experienced Solution Architect to lead the enterprise rollout of Microsoft Purview across a complex, global, multi-cloud environment. The consultant will define the architecture, implement domain-based governance, and drive adoption of Purview capabilities, including cataloging, lineage, classification, access governance, and compliance controls.
Key Responsibilities
- Architecture & Implementation
- Define target-state architecture for Microsoft Purview across Azure, AWS, M365, on-prem, and third-party platforms.
- Develop and drive the implementation roadmap across U.S. Businesses, PGIM, Corporate Technology, and international units.
- Establish Purview reference architecture, integration patterns, and guardrails.
- Domain-Based Governance
- Design collections, hierarchies, and RBAC aligned to domain structures and legal entity boundaries.
- Enable domain-owned stewardship while enforcing enterprise taxonomies and governance standards.
- Platform Configuration
- Configure Data Map, Catalog, Scans, Classifications, Sensitivity Labels, and Lineage.
- Optimize scan strategy (frequency, cost, performance) and extend classifiers and metadata models.
- Security & Compliance
- Integrate Purview with M365 Information Protection, Entra ID, and security baselines.
- Support PII/PCI/PHI detection, access governance, and regulatory compliance (SOX, GLBA, NYDFS, GDPR).
- Engineering & Integration
- Integrate with Synapse, Fabric, Databricks (including Unity Catalog), Snowflake, SQL Server, AWS sources, and SAP/Oracle.
- Implement IaC (Bicep/Terraform), CI/CD for Purview artifacts, and automation via APIs.
- Adoption & Stakeholder Management
- Deliver training, onboarding playbooks, and steward enablement.
- Lead workshops for new data domains and products.
- Provide executive-level reporting on progress, risks, and KPIs.
Required Qualifications
- 10+ years in data architecture/governance; 2+ years of hands-on Purview experience at enterprise scale.
- Strong expertise in metadata management, lineage, classification, scan optimization, glossary management, and domain-based operating models.
- Solid Azure ecosystem knowledge (Storage, Key Vault, Synapse, Fabric, Databricks), M365 Information Protection, and Entra ID.
- Experience with IaC (Bicep/Terraform), APIs/Atlas, and scripting (PowerShell/Python).
- Financial services or regulated industry exposure.
- Excellent communication, stakeholder leadership, and cross-domain facilitation skills.
Senior Software Engineer
Duration: 6+ months (CTH)
Location: Hybrid (Newark, NJ)
Summary
As a Senior Software Engineer on the Retirement Strategies Technology team, you will partner with product owners, tech leads, designers, engineers and delivery professionals to deliver quality platforms and products with speed. You will code, test and debug new and existing applications as you implement capabilities to solve sophisticated business problems, deploy innovative products, services and experiences to delight our customers! In addition to advanced technical expertise and experience, you will bring excellent problem solving, communication and teamwork skills, along with agile ways of working, strong business insight, an inclusive leadership attitude and a continuous learning focus to all that you do.
Here is What You Can Expect on a Typical Day
Build applications ensuring that the code follows the latest coding practices and industry standards, using modern design patterns and architectural principles; remove technical impediments.
Develop high-quality, well-documented and efficient code adhering to all applicable Prudential standards.
Collaborate with product owners in understanding needs and defining feature stories, with tech leads in defining technical design, and with other team members to understand the system end-to-end and deliver robust solutions that bring about business impact.
Write unit and integration tests and functional automation, researching problems discovered by quality assurance or product support and developing solutions to address them.
Bring a strong understanding of relevant and emerging technologies, provide input and coach team members, and embed learning and innovation in the day-to-day.
Work on complex problems in which analysis of situations or data requires an evaluation of intangible variables.
Use programming languages and frameworks including but not limited to Java, JavaScript, Spring Boot, and Node.js.
The Skills & Expertise You Bring:
Bachelor's degree in Computer Science or Engineering, or experience in related fields
Ability to coach others with minimal guidance and effectively leverage diverse ideas, experiences, thoughts and perspectives to the benefit of the organization
Experience with agile development methodologies and Test-Driven Development (TDD)
Knowledge of business concepts, tools and processes that are needed for making sound decisions in the context of the company's business
Ability to learn new skills and knowledge on an on-going basis through self-initiative and tackling challenges
Excellent problem solving, communication and collaboration skills
Advanced experience and/or expertise with several of the following:
Programming Languages: Java, JavaScript; working in distributed systems, object-oriented programming, design patterns and design methodology; Java services using Spring, microservices, multi-threading, concurrency, and parallel processing
Frameworks: Spring Boot, Node.js
Data Store: NoSQL or relational data structures
Data Streaming: SQS, SNS
Application Programming Interfaces (API): consumption & development; implementing service-oriented architecture (SOA) patterns; web service technologies such as REST, JSON, and SQL
API Management & Integration: Kong, Apigee
Unit, interface, and end-user testing concepts and tooling (functional & non-functional)
Automated testing
Accessibility awareness
Software security skills including secure coding and web application security; solid grasp of security concepts (authentication, authorization, encryption, digital signatures, JWT), SSL, web service proxies, firewalls, SAML 2.0, OpenID Connect, and OAuth 2.0
DevOps Tools & Practices: branching techniques and usage of GitHub
Software Development Life Cycle (SDLC): Monitoring and logging techniques
AWS Core Services across compute, storage, DB, IAM
Preferred Qualifications:
Strong experience with Domain-Driven Design (DDD)
AWS cloud native solution development
Architecture Patterns
Design and critical thinking
Financial/Insurance industry experience is a must, not a plus
People Leadership Experience is a plus.
Experience with agentic frameworks and AI-driven development tools is a major plus (e.g., Claude Code, GitHub Copilot).
Senior Business Analyst
Location: Newark, NJ (Hybrid)
Duration: 12 Months
Role Overview
Client is seeking a Senior Business Analyst to support the Product Enablement and Contract Automation initiative. This role is focused on enabling automated contract generation by establishing accurate, validated, and structured product data that serves as a single source of truth for Group Insurance product offerings.
The Senior Business Analyst works closely with business, product, and technology partners to translate contract and product intent into clear data, mapping, and process requirements that support integration between AWS Cloud, APIs, and SharePoint and OpenText Content Web Document Services (CWDS).
Key Responsibilities
- Partner with Group Insurance business, product, and technology stakeholders to understand contract automation objectives
- Identify, document, and validate field-level data elements required for automated contract generation
- Create data mapping specifications including transformation rules, validation criteria, and business logic
- Leverage AI-assisted tooling to accelerate data discovery, mapping analysis, and documentation
- Facilitate working sessions with business partners to validate data definitions, mappings, and contract logic
- Document end-to-end document generation workflows, including system interactions and exception handling
- Translate validated requirements into consumable artifacts for engineering and quality teams
- Support User Acceptance Testing (UAT) and implementation readiness activities
- Communicate risks, dependencies, and decisions across cross-functional teams
Required Qualifications
- 5+ years of experience as a Business Analyst or Business Systems Analyst
- Strong experience with data mapping, data validation, and integration-driven solutions
- Proven ability to validate requirements and outcomes with business partners
- Strong analytical, facilitation, and communication skills
Preferred Qualifications
- Experience supporting contract automation or document generation initiatives
- Familiarity with AWS Cloud, APIs, SharePoint, document management, or content services platforms
- Experience leveraging AI tools to support analysis and requirements documentation
My client is looking for a Senior R&D Project Manager to work onsite in their Parsippany, NJ office.
This is an exciting role: you will be responsible for the successful execution of product development projects. You will plan, coordinate, and lead the execution of activities to ensure that the goals and objectives of the project are accomplished within the prescribed timeframe and funding parameters. This is a technical position, and the candidate must have an engineering background in order to manage and contribute to the development of new products. The projects to be managed include a mix of new product development, product line extensions, and sustaining engineering releases.
Essential Functions
- Must have the ability to manage multiple projects simultaneously, including outside one's technical area of expertise.
- Ability to balance electrical, mechanical, and software development issues at the system level
- Lead the execution of assigned product development programs in accordance with established processes and procedures.
- Lead and motivate cross-functional team performance toward the goal of completing projects according to the defined objectives.
- Develop detailed project work plans and schedules.
- Manage product requirements and traceability.
- Lead design review and risk management activities.
- Manage technical partners/ vendors supporting product development activities.
- Effectively utilize problem solving skills and techniques to identify potential issues, assess their impact, and develop and implement mitigation and resolution plans and activities.
- Employ excellent interpersonal, communication and negotiation skills with all levels of personnel and management.
- Prepare and/or manage the preparation of all required project documentation.
- Facilitate and coordinate project team meetings and management presentations as required.
Required/Preferred Education and Experience
- BS degree in Engineering required.
- Advanced degree preferred.
- 5+ years managing technical product development.
- Experience with medical device capital equipment development.
- 10+ years of experience as an engineer developing products, preferably in the medical device industry.
- PMP certification desired.
Knowledge, Skills and Abilities
- Knowledge of global standards and regulations for Design Controls, Risk Management, and Electrical Safety for Medical Devices.
- Demonstrated aptitude for successfully managing multiple projects, of varying complexity, within the specified guidelines, timeframes and budgets.
- Demonstrated understanding of electrical, mechanical, and software engineering practices at the system level.
- Experience with Scrum and Agile processes.
- Knowledge of fluid mechanics or past experience with ventilators/aspirators a plus.
The annual salary for this position is $150K-$160K. This position is eligible for an annual bonus in accordance with the company’s bonus plans. Benefits include medical, dental, vision, 401K, etc.
Integration Architect
Location: Remote
Duration: 6 months
Role Overview
The Integration Architect defines, designs, and governs enterprise integration architecture standards across AWS, Azure, Microsoft Fabric, and on-prem systems. This consultant creates scalable integration blueprints, reusable patterns, and secure connectivity frameworks that ensure interoperability, reliability, and domain-aligned data exchange. The role partners closely with domain teams, platform engineering, API management teams, and enterprise architecture to accelerate delivery while maintaining architectural integrity.
Key Responsibilities
Integration Standards & Governance
- Define and maintain enterprise standards for API design, event schemas, messaging patterns, and integration contracts.
- Establish integration governance across AWS, Azure, MS Fabric, and on-prem systems.
- Define patterns for ADS (Authorized Data Sources) alignment, data contracts, schema evolution, and anchor key management.
- Enforce adherence to enterprise security principles, including OAuth2/OIDC, JWT, TLS, and Zero Trust patterns.
Blueprints & Reference Architecture
- Build and maintain unified enterprise integration architecture blueprints spanning cloud, Fabric, and on-prem connectivity.
- Create domain-specific and cross-domain integration flow maps, canonical API patterns, and event-driven reference architectures.
- Align AWS, Azure, MS Fabric, and on-prem patterns under the Unified Architecture.
Reusable Patterns & Engineering Enablement
- Develop reusable integration patterns for:
- AWS: API Gateway, EventBridge, SNS/SQS, Lambda, Step Functions, Glue, EMR, Redshift, Lake Formation, Kinesis, AWS Batch, AWS ECR, AWS ECS Fargate.
- Azure: APIM, Functions, Service Bus, Azure Data Factory (all IR types), Azure Synapse Pipelines, Azure Stream Analytics, Azure Batch, Azure Data Explorer ingestion.
- MS Fabric: Data Factory pipelines, Lakehouse ingestion interfaces, Fabric Data Pipelines, Notebook-based ETL, Warehouse ingestion.
- On-prem: MFT, MQ, legacy services.
- Provide templates for API contracts, event schemas, integration error handling, observability hooks, and resiliency patterns.
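As a rough illustration of the resiliency patterns this posting asks for (a sketch only, not an enterprise template; the helper name and parameters are hypothetical), a retry-with-exponential-backoff wrapper can be captured as a reusable pattern:

```python
import time

def retry_with_backoff(func, attempts=3, base_delay=0.1, sleep=time.sleep):
    """Call func, retrying on exception with exponential backoff.

    The `sleep` callable is injectable so tests can avoid real waiting.
    """
    for i in range(attempts):
        try:
            return func()
        except Exception:
            if i == attempts - 1:
                raise                      # out of retries: surface the error
            sleep(base_delay * (2 ** i))   # 0.1s, 0.2s, 0.4s, ...

# Simulate a transient failure that succeeds on the third call.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = retry_with_backoff(flaky, attempts=5, sleep=lambda s: None)
```

In a real integration template, the bare `except Exception` would typically be narrowed to retryable error types, and jitter would be added to the delay to avoid thundering-herd retries.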
Metadata, ADS, & Anchor Key Integration
- Define integration patterns incorporating ADS rules, domain ownership, and anchor key management for interoperability.
- Ensure all integration patterns embed security, observability, lineage awareness, and operational resiliency.
- Collaborate with data governance to ensure consistent entity resolution and cross-domain identifier mapping.
Domain Engagement & Architecture Review
- Guide domain teams in implementing target state integration architectures.
- Lead or participate in architecture reviews for API designs, event models, platform integrations, and connectivity.
- Recommend modernization opportunities to retire legacy integration mechanisms and adopt event-driven, API-first models.
Qualifications
Technical Expertise
- 8-12+ years in integration architecture, API engineering, event-driven design, or hybrid integration.
- Strong hands-on expertise across:
- AWS: API Gateway, EventBridge, SNS/SQS, Lambda, Step Functions, Glue, EMR, Redshift, Lake Formation, Kinesis, AWS Batch, AWS ECR, AWS ECS Fargate.
- Azure: APIM, Functions, Service Bus, Azure Data Factory (all IR types), Azure Synapse Pipelines, Azure Stream Analytics, Azure Batch, Azure Data Explorer ingestion.
- MS Fabric: Data Factory pipelines, Lakehouse ingestion interfaces, Fabric Data Pipelines, Notebook-based ETL, Warehouse ingestion.
- RDBMS: SQL, Oracle, DB2, RDS, etc.
- On-prem: MQ, MFT, REST/SOAP services.
- Understanding of ADS, anchor key management, data/domain contracts, and lineage-aware integration.
- Experience designing event-driven, API-first, batch, and hybrid integration architectures.
LTIMindtree is an equal opportunity employer that is committed to diversity in the workplace. Our employment decisions are made without regard to race, color, creed, religion, sex (including pregnancy, childbirth or related medical conditions), gender identity or expression, national origin, ancestry, age, family-care status, veteran status, marital status, civil union status, domestic partnership status, military service, handicap or disability or history of handicap or disability, genetic information, atypical hereditary cellular or blood trait, union affiliation, affectional or sexual orientation or preference, or any other characteristic protected by applicable federal, state, or local law, except where such considerations are bona fide occupational qualifications permitted by law.
Role: Java Backend Developer
Location: Berkeley Heights, NJ
Job Description:
Experience with VisionNext/VisionPLUS, cards, or payments is needed.
Java and Spring Boot microservices, with knowledge of AWS.
Key Responsibilities
• Design, develop, and optimize backend services for card payments and transaction systems, ensuring low latency, fault tolerance, and multi-region resiliency.
• Build high-throughput APIs and microservices using modern Java frameworks (Spring Boot, Reactor).
• Collaborate closely with product, architecture, and SRE teams to evolve Vision Next / VisionPLUS services for cloud-native, real-time scalability.
• Use AWS services (ECS, Lambda, RDS) to architect resilient, secure, and observable applications.
• Write efficient algorithms for transaction routing, settlement, reconciliation, or fraud-detection modules.
• Contribute to system design sessions and architecture decisions, applying deep reasoning to scalability trade-offs, consistency models, and data partitioning.
• Evaluate and optimize application throughput, concurrency handling, and API lifecycle management across multi-region clusters.
• Implement DevOps and CI/CD automation for build, test, and deployment pipelines (GitHub Actions, Jenkins, or CodePipeline).
• Mentor junior engineers, conduct code reviews, and drive engineering excellence through reusable design patterns.
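To illustrate the reconciliation work mentioned in the responsibilities above (a simplified sketch, not the client's actual system; all record fields are hypothetical), matching internal ledger entries against processor settlement records might look like:

```python
# Hypothetical settlement-reconciliation sketch: match internal ledger
# entries to processor records by transaction id and flag discrepancies.
def reconcile(ledger: list, processor: list) -> dict:
    by_id = {rec["txn_id"]: rec for rec in processor}
    matched, amount_mismatch, missing = [], [], []
    for entry in ledger:
        rec = by_id.get(entry["txn_id"])
        if rec is None:
            missing.append(entry["txn_id"])          # not settled by processor
        elif rec["amount_cents"] != entry["amount_cents"]:
            amount_mismatch.append(entry["txn_id"])  # settled at a different amount
        else:
            matched.append(entry["txn_id"])
    return {"matched": matched, "amount_mismatch": amount_mismatch, "missing": missing}

result = reconcile(
    [{"txn_id": "t1", "amount_cents": 500}, {"txn_id": "t2", "amount_cents": 250},
     {"txn_id": "t3", "amount_cents": 999}],
    [{"txn_id": "t1", "amount_cents": 500}, {"txn_id": "t2", "amount_cents": 260}],
)
```

A production version would also handle records present on the processor side but absent from the ledger, currency, and timing windows; amounts are kept in integer cents to avoid floating-point drift.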
Required Qualifications
• Bachelor's or Master's degree in Computer Science or a related field.
• 7+ years of backend engineering experience in payments, fintech, or high-transaction enterprise systems.
• Strong proficiency with Java / Spring Boot, data structures, algorithms, and system-level design principles.
• Solid understanding of AWS core services and architectural best practices for scalable distributed systems.
• Experience with multi-region, active-active, or near-real-time architectures for payment or settlement systems.
• Deep debugging, profiling, and performance optimization skills in concurrent, distributed environments.
• Strong analytical reasoning and data-driven problem-solving mindset.
Preferred Qualifications
• Expertise in Python programming for backend development and automation.
• Experience with Vision Next or Vision PLUS modules (CMS, ASM, or TRAMS) or other card processor platforms.
• Familiarity with payment rails (Visa, Mastercard, RTP, ACH) and transaction lifecycle management.
• Knowledge of Kafka, Redis, or Aerospike for event-driven processing and caching.
• Exposure to container orchestration (ECS, EKS, or Kubernetes) and observability platforms (Grafana, Datadog, or OpenTelemetry).
• Understanding of PCI-DSS, data encryption, and regulated financial data operations.