Simplicity Patterns Jobs in USA

1,351 positions found — Page 62

Cost & Margin Analyst
✦ New
Salary not disclosed
Katy, Texas 8 hours ago

About Momentara

Momentara is headquartered in Katy, Texas and partners with leading brands to support marketing programs across retail, events, and out-of-home environments. Our team is committed to delivering high-quality work through collaboration, accountability, and operational excellence.

We are currently seeking a Cost & Margin Analyst to support financial and operational analysis across production and job costing activities. This role will work closely with Finance and Operations leadership to analyze cost performance, identify variances, and help strengthen pricing accuracy, operational efficiency, and overall profitability.

JOB SUMMARY:

The Cost & Margin Analyst is responsible for analyzing job-level financial performance by comparing quoted estimates to actual production results. This role identifies cost variances, margin leakage, process inefficiencies, and pricing inaccuracies to drive improved profitability and operational discipline.

Backup Coverage:

The Cost & Margin Analyst provides backup support for other analytical and reporting functions within Finance and Operations as needed.

ESSENTIAL DUTIES AND RESPONSIBILITIES:

Core duties and responsibilities include the following. Other duties may be assigned.

  • Compare estimated vs. actual costs for labor, materials, freight, outside services, and overhead.
  • Analyze gross margin performance by job, customer, department, and product line.
  • Identify recurring variance patterns and root causes.
  • Provide weekly and monthly profitability reporting to Operations and Finance leadership.
  • Investigate significant cost overruns and underperforming jobs.
  • Partner with Estimating, Production, Scheduling, Purchasing, and Fulfillment to determine root causes.
  • Recommend adjustments to labor standards, run speeds, spoilage assumptions, and cost models.
  • Assist in refining estimating templates and pricing models.
  • Develop KPI dashboards for job profitability, labor efficiency, spoilage rates, setup times, and overtime impact.
  • Present findings and recommendations to Operations and Finance leadership.
  • Support Lean and continuous improvement initiatives.
  • Ensure accurate data flow between ERP systems and financial reporting.
  • Audit job closing procedures to confirm accurate cost capture.
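The estimate-vs-actual comparison at the heart of these duties can be sketched in a few lines. The cost categories, figures, and function names below are illustrative, not from any specific ERP; this is a minimal sketch of the variance and margin math, not a production tool.

```python
# Hypothetical sketch of estimate-vs-actual job cost variance analysis.
# Category names and dollar figures are illustrative only.

def job_variances(estimated: dict, actual: dict) -> dict:
    """Per-category cost variance (actual - estimated) for one job.

    A positive value is a cost overrun; a negative value is a saving.
    """
    return {cat: actual.get(cat, 0.0) - est for cat, est in estimated.items()}

def gross_margin(revenue: float, actual: dict) -> float:
    """Gross margin percentage for a job given its actual costs."""
    total_cost = sum(actual.values())
    return (revenue - total_cost) / revenue * 100

estimated = {"labor": 4000, "materials": 2500, "freight": 300, "outside_services": 700}
actual = {"labor": 4600, "materials": 2450, "freight": 350, "outside_services": 700}

variances = job_variances(estimated, actual)
overruns = {cat: v for cat, v in variances.items() if v > 0}
```

Repeating this per job, then grouping by customer, department, or product line, yields the recurring-variance patterns the role is asked to surface.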

REQUIRED SKILLS/EXPERIENCE:

  • Bachelor's degree in Finance, Accounting, Business, or Operations preferred.
  • 5–7 years of experience in manufacturing cost analysis, job costing, or financial analysis.
  • Experience in print manufacturing, packaging, or production-based environments strongly preferred.
  • Advanced Excel skills including formulas, pivot tables, and data modeling.
  • Experience with ERP/MIS systems (e.g., Pace, EFI, Monarch, SAP).
  • Strong analytical, problem-solving, and communication skills.

SUPERVISORY RESPONSIBILITIES:

This position does not have direct supervisory responsibility but works closely with cross-functional teams including Estimating, Production, and Finance.

PHYSICAL DEMANDS:

The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

  • Regular sitting, frequent use of hands for typing, and occasional standing or walking.
  • Specific vision abilities required include close vision and the ability to adjust focus.
  • The employee must occasionally lift up to 10 pounds.

WORK ENVIRONMENT:

The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

  • The noise level in the work environment is usually moderate.
Not Specified
Content Management System Developer
✦ New
🏢 Unisys
Salary not disclosed
Plano, Texas 8 hours ago
  • We are seeking a seasoned professional with deep expertise in OpenText Documentum and Enterprise Content Management (ECM) systems.

The ideal candidate will have the following qualifications:

Documentum Expertise:

  • 7+ years of experience in designing, developing, and troubleshooting Documentum applications (Content Server, D2 Config, D2 Classic, D2 Smart View, Brava, CTS, xPlore, DFC Client).
  • Strong understanding of Documentum architecture, object model, and security.
  • Experience with platform upgrades and migrations.

API Development:

  • 2+ years of experience implementing REST APIs using Spring Framework, preferably for Documentum or similar ECM platforms.

Programming & Tools:

  • 5+ years of experience with Java and/or Python.
  • Proficient in tools and frameworks such as Git, Jenkins, Jira, IntelliJ, Tomcat/J2EE, and relational databases.
  • Solid understanding of design patterns and software architecture.

Software Development Practices:

  • Familiarity with unit testing, code coverage, deployment processes, vulnerability management, and system monitoring.

Cloud Experience (AWS):

  • 3+ years of experience working with AWS services including EC2, ECS, RDS, ALB, SSM, SQS, SNS, Lambda, and AWS SDK.
  • Skilled in troubleshooting AWS deployments, reviewing configurations, security policies, and analyzing logs.

Agile Methodology:

  • Strong understanding of Agile Scrum practices such as sprint planning, backlog refinement, daily stand-ups, and retrospectives.

Collaboration & Communication:

  • Ability to work effectively with cross-functional teams and communicate technical concepts clearly.

Self-Management:

  • Capable of managing time, tasks, and priorities independently.

Certifications:

  • Preferred Certifications include AWS, Documentum, and Java.


Not Specified
Cloud Engineer (AWS Neptune)
✦ New
Salary not disclosed
San Antonio, Texas 8 hours ago

Role: AWS Cloud Engineer

Location: San Antonio, TX (On-Site)

  • Identify and select the data sources, tables, and relationships between entities in Snowflake.
  • Establish a secure connection between Snowflake and AWS Neptune via the AWS service stack, securing extracted data stored in S3 for transformation before loading into AWS Neptune.
  • Collaborate with the current platform team to understand the data structure and implement data extraction processes from Snowflake.
  • Load the extracted data into AWS S3 for graph model transformation.
  • Analyze the data, relationships, and entity mappings to determine the necessary graph schema.
  • Design nodes, edges, and properties for the graph schema using entity definitions.
  • Implement the graph schema in AWS Neptune.
  • Create indexes to improve query performance.
  • Review and refine the schema based on query patterns and performance, per the mutually agreed design.
  • Define the transformation logic using DBT.
  • Develop DBT models for data transformation.
  • Schedule and automate the ELT pipeline using DBT and Snowflake.
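The "graph model transformation" step above — reshaping relational rows into nodes and edges — can be sketched as follows. The table and column names are hypothetical; the `~id`/`~label`/`~from`/`~to` column headers are the layout the Neptune bulk loader expects for Gremlin CSV files, shown here only as a minimal illustration.

```python
# Hypothetical sketch: reshape rows extracted from Snowflake into the
# vertex/edge record layout used by the Neptune bulk loader (vertex files
# carry ~id/~label columns; edge files carry ~id/~from/~to/~label).
# Table and column names below are illustrative.

def to_vertex_rows(rows, label, id_col):
    """Map tabular rows to Neptune vertex records."""
    out = []
    for r in rows:
        rec = {"~id": f"{label}:{r[id_col]}", "~label": label}
        # Remaining columns become vertex properties.
        rec.update({k: v for k, v in r.items() if k != id_col})
        out.append(rec)
    return out

def to_edge_rows(rows, label, from_col, from_label, to_col, to_label):
    """Map tabular rows to Neptune edge records."""
    return [
        {
            "~id": f"{label}:{r[from_col]}-{r[to_col]}",
            "~from": f"{from_label}:{r[from_col]}",
            "~to": f"{to_label}:{r[to_col]}",
            "~label": label,
        }
        for r in rows
    ]

customers = [{"customer_id": "C1", "name": "Acme"}]
orders = [{"order_id": "O1", "customer_id": "C1"}]

vertices = to_vertex_rows(customers, "customer", "customer_id")
edges = to_edge_rows(orders, "placed", "customer_id", "customer", "order_id", "order")
```

In practice this mapping would run over the S3-staged extracts, with the resulting files handed to the bulk loader or written via Gremlin/openCypher.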

Not Specified
Logistics Risk Analyst
✦ New
Salary not disclosed
Santa Ana, California 8 hours ago

Data Analyst 1 (Logistics Risk Analyst)

Santa Ana, CA (fully onsite)

12 months

**Bilingual Korean is not required but strongly preferred.

Role Summary

  • Logistics Risk Analyst focuses on data-driven identification, analysis, and monitoring of logistics risks. This role serves as the analytical engine of the logistics risk management function, transforming operational data into actionable insights.

Key Responsibilities

  • Analyze logistics data from TMS, WMS, ERP, and claims systems to identify risk patterns and anomalies.
  • Develop and maintain logistics risk dashboards, KPIs, and early-warning indicators.
  • Track trends in loss, damage, delay, and service failures.
  • Support root cause analysis with quantitative evidence and data modeling.
  • Prepare regular risk reports and ad-hoc analyses for management review.
  • Partner with IT and operations teams to improve data quality and system integration.
  • Support audits, compliance reviews, and risk assessments with data analysis.
  • Recommend process improvements based on data findings.
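An early-warning indicator of the kind listed above can be sketched simply: flag carriers (or lanes) whose damage rate sits far above the typical rate. The carriers, rates, and the 3x-median threshold below are all illustrative assumptions, not a prescribed methodology.

```python
# Hypothetical sketch of a logistics early-warning indicator: flag any
# carrier whose damage rate exceeds a multiple of the median rate.
# Carrier names, rates, and the threshold multiple are illustrative.
from statistics import median

def flag_outliers(rates: dict, multiple: float = 3.0) -> list:
    """Return carriers whose rate exceeds `multiple` x the median rate.

    The median is used as the baseline because a single extreme carrier
    would distort a mean-based threshold.
    """
    baseline = median(rates.values())
    return [carrier for carrier, r in rates.items() if r > multiple * baseline]

damage_rates = {"CarrierA": 0.010, "CarrierB": 0.012, "CarrierC": 0.011,
                "CarrierD": 0.009, "CarrierE": 0.080}
alerts = flag_outliers(damage_rates)
```

The same shape of check applies to delay rates, claim frequencies, or service failures pulled from TMS/WMS data.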

Qualifications:

  • Bachelor's degree in Data Analytics, Supply Chain, Engineering, Statistics, or related field.
  • 2 years of experience in logistics analytics, operations analytics, or supply chain data analysis.
  • Hands-on experience with Excel, SQL, and BI tools (Power BI, Tableau, etc.).
  • Familiarity with TMS/WMS and logistics performance metrics.
  • Experience handling large datasets and building dashboards.

Core Competencies

  • Strong analytical and quantitative skills
  • High attention to detail and data accuracy
  • Ability to translate data into business insights
  • Structured analysis

Top Skills:

  • Excel (pivot tables, lookups, data matching & comparison)
  • Communication
  • Analysis
Not Specified
Platform Security Architect
✦ New
Salary not disclosed
Dallas, Texas 8 hours ago

We are seeking a Platform Security Architect to help secure customer-facing platforms across web, mobile, and API environments within an AWS ecosystem. This role partners closely with software engineering and architecture teams to ensure secure-by-design solutions for modern cloud applications.

Key Responsibilities

  • Lead security architecture reviews for web, mobile, and API-driven platforms
  • Provide hands-on guidance to engineering teams on secure design patterns
  • Define and implement security guardrails across AWS cloud environments
  • Support secure authentication and authorization frameworks (OAuth2, OIDC, SAML)
  • Ensure secure integration across microservices, APIs, and CI/CD pipelines

Required Experience

  • 7+ years of IT experience with a focus on cloud or application security
  • Background in software engineering, platform engineering, or cloud architecture
  • Strong experience securing API-driven applications and microservices
  • Hands-on AWS cloud security (IAM, encryption, networking, monitoring)
  • Experience with OAuth2, OpenID Connect, SAML, or CIAM architectures

Preferred

  • Experience supporting customer-facing platforms or payment environments
  • Familiarity with PCI DSS and modern security frameworks
  • AWS or security-related certifications

This role is ideal for a software engineer or cloud architect who transitioned into security, with experience securing modern API-based platforms in AWS.

Not Specified
Software Developer
✦ New
Salary not disclosed
Lansing, Michigan 8 hours ago

Position: Software Developer

Duration: 12-month contract plus possible extension

Location: Lansing, Michigan; 2 days per week onsite (Monday & Tuesday)

Pay Range: $45/hr - $50/hr

**Exact compensation may vary based on several factors, including skills, experience, and education.

**Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.

Description:

The position is responsible for providing ongoing maintenance and support of complex Java applications and developing enhancements to applications supported within our department. The resource is integral to developing and maintaining automated processes; streamlining critical business processes; and ensuring data integrity, SEM/SUITE compliance, and application security. Without this resource on staff, the organization would have to manually document and develop screen plans, which can introduce errors, compromise data integrity, and eventually lead to incorrect information being processed and reported.

  • Write well-designed, testable code using Spring MVC, the Hibernate framework for entity object mapping, jQuery/HTML5, JavaScript, HTML, XML, and Angular.
  • Develop business application components using object-oriented Java/JEE technologies and design principles.
  • Design and develop RESTful web services using the Spring Web MVC framework.
  • Design, develop, and maintain applications using the Apache Struts framework.
  • Implement design patterns such as Intercepting Filter, Front Controller, Session Façade, DAO, Singleton, and Service Locator.
  • Build and maintain unit test frameworks with JUnit and Spring Boot.
  • Execute the full software development life cycle (SDLC), including gathering requirements and writing functional/technical specifications for complex projects.
  • Develop database objects including stored procedures and functions.
  • Troubleshoot issues using SQL and PL/SQL scripts.
  • Tune SQL queries and scripts.
  • Integrate software components into a fully functional software system.
  • Use source control tools such as Git.
  • Use build frameworks such as Maven; maintain source code under source control, baseline software versions, and build WAR files for deployments.
  • Develop with application servers including Apache Tomcat, JBoss, WebSphere, and OpenShift Container Platform.
  • Develop software design documents and work with stakeholders for review and approval.
  • Develop prototypes and mockups for user review and approval.
  • Create flowcharts, screen layouts, and documentation to ensure a logical flow of system requirements.
  • Experience with React.js and modern JavaScript (ES6+).
  • Understanding of Redux, React Router, and the component lifecycle.
  • Experience working on large agile projects.
  • Experience with Java 17+, SOAP web services, and the Java Messaging Service (JMS) API.
  • Experience with Spring Boot projects and the Spring Data, Spring Batch, and Spring Security frameworks.

Qualifications:

  • 5+ years developing complex computer systems using Java.
  • 5+ years developing complex computer systems using Java IDEs such as Eclipse and STS.
  • 5+ years programming using Java JEE Struts Framework.
  • 5+ years programming in SQL and/or PL/SQL.
  • 5+ years programming using Java JEE Spring/SpringBoot Framework 3.0.
  • 5+ years of development using Hibernate/JPA framework.
  • 3+ years in projects development using Angular/React JS, JavaScript framework.
  • 3+ years programming in the JBOSS Enterprise SOA environment including JBOSS Workflow.
  • 3+ years using CMM/CMMI Level 3 methods and practices.
  • 2+ years implementing agile development processes, including test-driven development.
  • 2+ years of experience with React.js and modern JavaScript (ES6+).
  • Understanding of Redux, React Router, and component lifecycle.
  • Exposure to DevOps practices and cloud platforms (AWS, Azure).
  • Hands-on experience using AI to accelerate daily coding tasks, including code generation, refactoring and documentation.
Not Specified
IT Software Developer Sr./Go Developer
✦ New
Salary not disclosed
Columbus, Ohio 8 hours ago

Immediate need for a talented IT Software Developer Sr./Go Developer. This is a 12+ month contract opportunity with long-term potential and is located in Columbus, OH (remote). Please review the job description below and contact me ASAP if you are interested.

Job ID:26-08816

Pay Range: $67 - $68.57/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).

Key Responsibilities:

  • Participate in work planning and management activities as part of a team utilizing agile methodology (work items managed in GitHub, team stand-up meetings, and other sprint ceremonies).
  • Document and track assignments according to team standards and using team tools.
  • Perform daily application development activities (designing solution, programming, testing, collaborating with peers etc.)

Key Requirements and Technology Experience:

  • Must-have skills: Golang, AWS, monitoring and observability tools, CI/CD, AWS RDS, PostgreSQL, DynamoDB, blue/green and canary deployments.
  • Advanced skills in Go programming.
  • Expertise in public cloud and associated toolsets (AWS, Azure, Terraform, Kubernetes/OpenShift)
  • Expertise in CI/CD and release engineering tooling (Azure DevOps, GitHub Actions) and deployment patterns (blue/green, canary)
  • Experience with monitoring and observability tools (Splunk, App Dynamics, Open Telemetry) and a working knowledge of networking and security
  • Database expertise (AWS RDS, PostgreSQL, DynamoDB)
  • Ability to hit the ground running, working both independently and collaboratively while taking full ownership.
  • Demonstrated ability to leverage AI tools effectively without over-reliance, maintaining sound independent judgment and critical thinking.
  • Hands-on coding and development position.
  • Comfort navigating the unknown in a small, self-directed agile team is a plus.
  • Good communication skills, with the ability to translate between technical and non-technical audiences, are a plus.

Our client is a leader in the insurance industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.

Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.

Not Specified
Cyber Security Engineer
✦ New
🏢 CBTS
Salary not disclosed
Atlanta, Georgia 8 hours ago

Role: Cybersecurity Engineer

Location: Atlanta, GA (Hybrid)

Duration: 6 Months

Job Responsibilities / Typical Day in the Role

  • Implement and maintain WAF protections across web/API properties.
  • Write and tune WAF rules (e.g., custom rules, bot controls, rate limits).
  • Analyze logs/alerts to identify malicious patterns and false positives; adjust policies.
  • Collaborate with product/engineering to integrate WAF in the SDLC and CI/CD.
  • Build automation and capabilities via code to support the WAF program.
  • Document runbooks, change procedures, and playbooks for common scenarios.
  • Participate in an on-call rotation.
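The "rate limits" duty above boils down to logic a WAF rule engine evaluates per request. As a minimal sketch only — thresholds, window size, and the sliding-window approach are illustrative assumptions, not any vendor's rule syntax — the core decision looks like this:

```python
# Minimal sketch of the sliding-window rate-limit logic a WAF rule encodes:
# deny a client IP once it exceeds N requests inside the window.
# The limit and window size below are illustrative.
import time
from collections import defaultdict
from typing import Optional

class RateLimiter:
    def __init__(self, limit: int, window_s: float):
        self.limit = limit
        self.window_s = window_s
        self.hits = defaultdict(list)  # ip -> timestamps of allowed requests

    def allow(self, ip: str, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        # Keep only timestamps still inside the window.
        window = [t for t in self.hits[ip] if now - t < self.window_s]
        self.hits[ip] = window
        if len(window) >= self.limit:
            return False  # rule matches: rate limit exceeded, block/deny
        window.append(now)
        return True

rl = RateLimiter(limit=3, window_s=60.0)
decisions = [rl.allow("203.0.113.7", now=t) for t in (0.0, 1.0, 2.0, 3.0)]
```

A managed WAF expresses the same idea declaratively (e.g. rate-based rules in Terraform/CloudFormation), with the engine handling the counting.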

Must Have Skills / Requirements

Experience with Cloud Environments

  • 2+ years of experience

Scripting Experience

  • 2+ years of experience with at least one of Python, Bash, or Go.

IaC Experience

  • 2+ years of experience working with infrastructure as code.

Nice to Have Skills / Preferred Requirements

  • Terraform/CloudFormation for WAF config as code; CDN integrations.
  • Security certs (e.g., GIAC/GWAPT, CISSP/CSSLP) are a plus but not required.
  • Oracle Cloud or GCP experience.
Not Specified
Data Engineer - GCP
✦ New
Salary not disclosed
Phoenix, Arizona 8 hours ago

Job Summary

We are seeking a skilled Data Engineer with 5+ years of hands-on experience designing, building, and maintaining scalable data pipelines and data platforms. The ideal candidate has strong experience working with DAG-based orchestration, cloud technologies (preferably Google Cloud Platform), SQL-driven data processing, Apache Spark, and Python-based API development using Fast API. You will play a key role in enabling reliable data ingestion, transformation, and quality assurance across enterprise systems.

Key Responsibilities

  • Design, develop, and maintain DAG-based data pipelines (Airflow or similar orchestration tools).
  • Build and optimize SQL-based data transformations for analytics and reporting.
  • Develop and manage batch and streaming data pipelines using Apache Spark.
  • Implement Python-based REST APIs using FastAPI for data services and integrations.
  • Perform data quality checks, validation, reconciliation, and anomaly detection.
  • Work with cloud platforms (preferably Google Cloud Platform) for storage, compute, and orchestration.
  • Architect and implement cloud-native data platforms on GCP, leveraging services such as BigQuery, BigTable, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
  • Monitor pipeline performance, troubleshoot failures, and optimize processing efficiency.
  • Collaborate with analytics, application, and business teams to understand data requirements.
  • Ensure best practices around security, scalability, and maintainability.
  • Ensure data quality, reliability, security, governance, and compliance with enterprise standards.

Required Skills & Experience

  • 5+ years of experience as a Data Engineer.
  • Strong experience with DAG orchestration (e.g., Apache Airflow).
  • Solid understanding of cloud technologies, preferably Google Cloud Platform (GCP).
  • Advanced proficiency in SQL for data processing and transformations.
  • Hands-on experience running and tuning Apache Spark jobs.
  • Experience developing APIs using Python and FastAPI.
  • Strong understanding of data quality frameworks, checks, and validation techniques.
  • Proficiency in Python, Java, Scala, or PySpark, with strong SQL expertise.
  • Hands-on experience with GCP data services, including BigQuery, BigTable, Dataproc, Dataflow, and cloud-native ETL patterns.
  • Experience with software delivery methodologies such as Agile, Scrum, and CI/CD practices.
  • Strong analytical and problem-solving skills.
  • Ability to work independently and in cross-functional teams.
  • Good communication and documentation skills.
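The data-quality duties listed above (checks, validation, reconciliation, anomaly detection) reduce to a few recurring check shapes. This is a hedged, framework-free sketch; the column names, bounds, and helper names are hypothetical, and a real pipeline would run such checks as a step in the orchestrated DAG.

```python
# Illustrative sketch of pipeline data-quality checks: null checks,
# range checks, and a source-vs-target row-count reconciliation.
# Column names and bounds are hypothetical.

def check_not_null(rows, column):
    """Return indexes of rows where `column` is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def check_range(rows, column, lo, hi):
    """Return indexes of rows where `column` falls outside [lo, hi]."""
    return [i for i, r in enumerate(rows)
            if r.get(column) is not None and not (lo <= r[column] <= hi)]

def reconcile_counts(source_count, target_count, tolerance=0):
    """True if the target row count is within `tolerance` of the source."""
    return abs(source_count - target_count) <= tolerance

rows = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": -5.0},
    {"order_id": None, "amount": 40.0},
]
null_ids = check_not_null(rows, "order_id")
bad_amounts = check_range(rows, "amount", 0.0, 10_000.0)
```

Failing checks would typically fail the pipeline task or route offending rows to a quarantine table for review.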
Not Specified
Technical Specialist 4
✦ New
Salary not disclosed
Columbus, Ohio 8 hours ago

Position Title: Technical Specialist 4 (TS4) – Data Engineering

Location: 30 E. Broad St. Columbus, OH 43215

Mode: Remote (Report 1st day to office)

Interview: Virtual

Clearance Requirements: None

Position Status: Contract (12 Months)

Position Description

We are seeking an experienced Technical Specialist 4 (TS4) to design and implement data engineering solutions that advance the agency's data ecosystem. This senior-level role requires a strong background in data integration, data modeling, and enterprise data architecture. You will lead the evaluation and selection of data platforms, assist with data storage solutions, and shape scalable, secure, and well-governed data solutions for the agency's analytics and operational systems.

As a key technical contributor, you will collaborate with IT Architecture teams and senior leadership to ensure data solutions align with business needs, enterprise standards, and long-term goals. Your leadership and technical expertise will directly impact the evolution of data systems, from integration to ongoing optimization and maintenance.

Key Responsibilities

  • Design & Maintain Data Models: Create and support conceptual, logical, and physical data models for enterprise analytics and operational systems.
  • Data Governance & Standards: Establish data modeling standards, naming conventions, and design patterns to ensure consistency across all data platforms.
  • Enterprise Data Architecture: Contribute to the development and evolution of the agency's enterprise data architecture roadmap, aligning with long-term goals and standards.
  • Data Integration & Solutions: Evaluate and implement scalable data integration solutions, ensuring interoperability and alignment with enterprise integration strategies.
  • Technical Leadership: Lead technical discussions related to data system design, implementation, optimization, and maintenance, guiding the Data Management team on best practices.
  • Collaboration: Work closely with internal teams and enterprise partners to configure integrations between agency systems and external data platforms such as data lakes and data quality platforms.

Required Skills/Education

  • Experience: Minimum 5 years of hands-on experience in data integration, data cleansing, data modeling, and data classification.
  • Skills:
  • Proficient in designing and maintaining data models supporting enterprise-level analytics and operational systems.
  • Expertise in data integration and ensuring data governance across various data platforms.
  • Strong technical leadership and the ability to guide teams through complex technical decisions.
  • Experience with enterprise data architecture, data lakes, and integration with third-party platforms.
  • Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • Certifications: Relevant certifications in data architecture, enterprise architecture, or similar fields are a plus.
Not Specified
jobs by JobLookup