Role: Embedded Engineer
Location: Boise, ID or Dallas, TX; 4 days onsite, 1 day remote (the remote day may not be a Friday or Monday)
Travel Requirements:
Up to 10% overnight travel
As the R&D Senior Software Engineer, you are an expert in Linux development using both C++ and Python. You have experience designing production-ready software and understand the patterns and architectures used to create reliable and maintainable codebases. You have experience with common robotics software tools and concepts such as ROS, motor control, localization, and navigation, and with hardware such as LiDAR, cameras, motors, and encoders. Most importantly, you have experience leading teams, driving processes, inspiring software engineers, and creating strong team cultures with open communication.
Qualifications
· Bachelor's degree in Computer Science, Computer Engineering, or a similar field
· 8+ years of relevant experience
· Experience designing and implementing production software systems.
· Experience in an Agile development environment and a strong drive for SOPs.
· Expert in Linux environment and developing for Linux systems.
· Expertise in C/C++ and Python; experience with other software technologies (web development, SQL, etc.).
· Experience with CI/CD and testing methodologies and implementation.
· Experience with Containerization and Deployment Strategies.
· Experience with version control systems, Git preferred.
· Experience with ROS (Robot Operating System).
Thanks and Regards
Ashish Tripathi || US IT Recruiter, KPG99, INC
About Momentara
Momentara is headquartered in Katy, Texas and partners with leading brands to support marketing programs across retail, events, and out-of-home environments. Our team is committed to delivering high-quality work through collaboration, accountability, and operational excellence.
We are currently seeking a Cost & Margin Analyst to support financial and operational analysis across production and job costing activities. This role works closely with Finance and Operations leadership to analyze cost performance, identify variances, and help strengthen pricing accuracy, operational efficiency, and overall profitability.
JOB SUMMARY:
The Cost & Margin Analyst is responsible for analyzing job-level financial performance by comparing quoted estimates to actual production results. This role identifies cost variances, margin leakage, process inefficiencies, and pricing inaccuracies to drive improved profitability and operational discipline.
Backup Coverage:
The Cost & Margin Analyst provides backup support for other analytical and reporting functions within Finance and Operations as needed.
ESSENTIAL DUTIES AND RESPONSIBILITIES:
Core duties and responsibilities include the following. Other duties may be assigned.
- Compare estimated vs. actual costs for labor, materials, freight, outside services, and overhead.
- Analyze gross margin performance by job, customer, department, and product line.
- Identify recurring variance patterns and root causes.
- Provide weekly and monthly profitability reporting to Operations and Finance leadership.
- Investigate significant cost overruns and underperforming jobs.
- Partner with Estimating, Production, Scheduling, Purchasing, and Fulfillment to determine root causes.
- Recommend adjustments to labor standards, run speeds, spoilage assumptions, and cost models.
- Assist in refining estimating templates and pricing models.
- Develop KPI dashboards for job profitability, labor efficiency, spoilage rates, setup times, and overtime impact.
- Present findings and recommendations to Operations and Finance leadership.
- Support Lean and continuous improvement initiatives.
- Ensure accurate data flow between ERP systems and financial reporting.
- Audit job closing procedures to confirm accurate cost capture.
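The estimate-vs-actual comparison at the heart of these duties can be sketched in a few lines. The job records, revenue figure, and cost categories below are hypothetical illustrations, not Momentara's actual data model:

```python
# Sketch of job-level estimate-vs-actual variance and margin analysis.
# All job data and figures here are hypothetical illustrations.

CATEGORIES = ["labor", "materials", "freight", "outside_services", "overhead"]

def job_variances(estimated: dict, actual: dict) -> dict:
    """Per-category variance (actual - estimated) for one job."""
    return {c: actual.get(c, 0.0) - estimated.get(c, 0.0) for c in CATEGORIES}

def gross_margin(revenue: float, costs: dict) -> float:
    """Gross margin as a fraction of revenue."""
    return (revenue - sum(costs.values())) / revenue

# Hypothetical job: quoted vs. actual costs
estimated = {"labor": 400.0, "materials": 300.0, "freight": 50.0,
             "outside_services": 100.0, "overhead": 150.0}
actual = {"labor": 480.0, "materials": 310.0, "freight": 50.0,
          "outside_services": 100.0, "overhead": 150.0}

variances = job_variances(estimated, actual)    # labor overran the quote by 80
quoted_margin = gross_margin(1500.0, estimated)
actual_margin = gross_margin(1500.0, actual)
margin_leakage = quoted_margin - actual_margin  # margin points lost to overruns
```

Repeating this per job and grouping by customer, department, or product line yields the recurring-variance patterns the role is asked to surface.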
REQUIRED SKILLS/EXPERIENCE:
- Bachelor's degree in Finance, Accounting, Business, or Operations preferred.
- 5–7 years of experience in manufacturing cost analysis, job costing, or financial analysis.
- Experience in print manufacturing, packaging, or production-based environments strongly preferred.
- Advanced Excel skills including formulas, pivot tables, and data modeling.
- Experience with ERP/MIS systems (e.g., Pace, EFI, Monarch, SAP).
- Strong analytical, problem-solving, and communication skills.
SUPERVISORY RESPONSIBILITIES:
This position does not have direct supervisory responsibility but works closely with cross-functional teams including Estimating, Production, and Finance.
PHYSICAL DEMANDS:
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
- Regular sitting, frequent use of hands for typing, and occasional standing or walking.
- Specific vision abilities required include close vision and the ability to adjust focus.
- The employee must occasionally lift up to 10 pounds.
WORK ENVIRONMENT:
The work environment characteristics described here are representative of those an employee encounters while performing the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
- The noise level in the work environment is usually moderate.
We are seeking a seasoned professional with deep expertise in OpenText Documentum and Enterprise Content Management (ECM) systems.
The ideal candidate will have the following qualifications:
Documentum Expertise:
- 7+ years of experience in designing, developing, and troubleshooting Documentum applications (Content Server, D2 Config, D2 Classic, D2 Smart View, Brava, CTS, xPlore, DFC Client).
- Strong understanding of Documentum architecture, object model, and security.
- Experience with platform upgrades and migrations.
API Development:
- 2+ years of experience implementing REST APIs using Spring Framework, preferably for Documentum or similar ECM platforms.
Programming & Tools:
- 5+ years of experience with Java and/or Python.
- Proficient in tools and frameworks such as Git, Jenkins, Jira, IntelliJ, Tomcat/J2EE, and relational databases.
- Solid understanding of design patterns and software architecture.
Software Development Practices:
- Familiarity with unit testing, code coverage, deployment processes, vulnerability management, and system monitoring.
Cloud Experience (AWS):
- 3+ years of experience working with AWS services including EC2, ECS, RDS, ALB, SSM, SQS, SNS, Lambda, and AWS SDK.
- Skilled in troubleshooting AWS deployments, reviewing configurations, security policies, and analyzing logs.
Agile Methodology:
- Strong understanding of Agile Scrum practices such as sprint planning, backlog refinement, daily stand-ups, and retrospectives.
Collaboration & Communication:
- Ability to work effectively with cross-functional teams and communicate technical concepts clearly.
Self-Management:
- Capable of managing time, tasks, and priorities independently.
Certifications:
- Preferred Certifications include AWS, Documentum, and Java.
#LI-CGTS
#TS-3142
Role: AWS Cloud Engineer
Location: San Antonio, TX (On-Site)
• Identify and select the data sources, tables, and relationships between entities in Snowflake.
• Establish a secure connection between Snowflake and AWS Neptune via the AWS service stack, securing the extracted data stored in S3 for transformation before loading into AWS Neptune.
• Collaborate with the current platform team to understand the data structure and implement data extraction processes from Snowflake.
• Load the extracted data into AWS S3 for graph model transformation.
• Analyze the data, relationships, and entity mappings to determine the necessary graph schema.
• Design nodes, edges, and properties for the graph schema using the entity definitions.
• Implement the graph schema in AWS Neptune.
• Create indexes to improve query performance.
• Review and refine the schema based on query patterns and performance, per the mutually agreed design.
• Define the transformation logic using DBT.
• Develop DBT models for data transformation.
• Schedule and automate the ELT pipeline using DBT and Snowflake.
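The "graph model transformation" step above commonly targets the Neptune bulk loader's Gremlin CSV format (`~id`/`~label` headers for vertices; `~id`/`~from`/`~to`/`~label` for edges). The sketch below converts hypothetical relational rows into that shape; the table and column names are illustrative assumptions, not the actual Snowflake schema:

```python
import csv
import io

def rows_to_vertex_csv(rows, label, id_col, props):
    """Convert relational rows into Neptune bulk-loader vertex CSV.

    rows: list of dicts from a (hypothetical) Snowflake extract.
    props: mapping of CSV property header (e.g. 'name:String') -> source column.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["~id", "~label"] + list(props))
    for row in rows:
        writer.writerow([row[id_col], label] + [row[col] for col in props.values()])
    return buf.getvalue()

def rows_to_edge_csv(rows, label, id_col, from_col, to_col):
    """Convert relationship rows into Neptune bulk-loader edge CSV."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["~id", "~from", "~to", "~label"])
    for row in rows:
        writer.writerow([row[id_col], row[from_col], row[to_col], label])
    return buf.getvalue()

# Hypothetical extract: customers and the orders they placed
customers = [{"customer_id": "c1", "name": "Acme"}]
orders = [{"edge_id": "e1", "customer_id": "c1", "order_id": "o9"}]

vertex_csv = rows_to_vertex_csv(customers, "Customer", "customer_id",
                                {"name:String": "name"})
edge_csv = rows_to_edge_csv(orders, "placed", "edge_id", "customer_id", "order_id")
# These CSVs would next be staged in S3 (e.g. via boto3) and ingested
# through Neptune's bulk loader endpoint.
```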
Data Analyst 1 (Logistics Risk Analyst)
Santa Ana, CA (Fully onsite)
12 months
**Bilingual Korean: not required, but strongly preferred
Role Summary
The Logistics Risk Analyst focuses on data-driven identification, analysis, and monitoring of logistics risks. This role serves as the analytical engine of the logistics risk management function, transforming operational data into actionable insights.
Key Responsibilities
- Analyze logistics data from TMS, WMS, ERP, and claims systems to identify risk patterns and anomalies.
- Develop and maintain logistics risk dashboards, KPIs, and early-warning indicators.
- Track trends in loss, damage, delay, and service failures.
- Support root cause analysis with quantitative evidence and data modeling.
- Prepare regular risk reports and ad-hoc analyses for management review.
- Partner with IT and operations teams to improve data quality and system integration.
- Support audits, compliance reviews, and risk assessments with data analysis.
- Recommend process improvements based on data findings.
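The anomaly-identification and early-warning work described above can be as simple as a z-score screen over an operational metric. The sketch below is a stdlib-only illustration; the transit-time values are hypothetical, and a real pipeline would pull them from TMS/WMS extracts:

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.0):
    """Return indices of points whose z-score exceeds the threshold.

    A simple early-warning check for delay/damage/service-failure trends.
    """
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Hypothetical daily transit times (hours) for a single lane
transit_hours = [24, 25, 23, 26, 24, 25, 48, 24, 23, 25]
anomalies = flag_anomalies(transit_hours)  # the 48-hour day stands out
```

In practice the threshold would be tuned per lane or carrier, and flagged points fed into the risk dashboards and root-cause reviews listed above.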
Qualifications:
- Bachelor's degree in Data Analytics, Supply Chain, Engineering, Statistics, or related field.
- 2 years of experience in logistics analytics, operations analytics, or supply chain data analysis.
- Hands-on experience with Excel, SQL, and BI tools (Power BI, Tableau, etc.).
- Familiarity with TMS/WMS and logistics performance metrics.
- Experience handling large datasets and building dashboards.
Core Competencies
- Strong analytical and quantitative skills
- High attention to detail and data accuracy
- Ability to translate data into business insights
- Structured analysis.
Education and Years of Experience:
- Bachelor's degree in Data Analytics, Supply Chain, Engineering, Statistics, or related field.
- 2 years of experience in logistics analytics, operations analytics, or supply chain data analysis.
Top Skills:
- Excel (Pivot, Lookup, data matching & compare)
- Communication
- Analysis
We are seeking a Platform Security Architect to help secure customer-facing platforms across web, mobile, and API environments within an AWS ecosystem. This role partners closely with software engineering and architecture teams to ensure secure-by-design solutions for modern cloud applications.
Key Responsibilities
- Lead security architecture reviews for web, mobile, and API-driven platforms
- Provide hands-on guidance to engineering teams on secure design patterns
- Define and implement security guardrails across AWS cloud environments
- Support secure authentication and authorization frameworks (OAuth2, OIDC, SAML)
- Ensure secure integration across microservices, APIs, and CI/CD pipelines
Required Experience
- 7+ years of IT experience with a focus on cloud or application security
- Background in software engineering, platform engineering, or cloud architecture
- Strong experience securing API-driven applications and microservices
- Hands-on AWS cloud security (IAM, encryption, networking, monitoring)
- Experience with OAuth2, OpenID Connect, SAML, or CIAM architectures
Preferred
- Experience supporting customer-facing platforms or payment environments
- Familiarity with PCI DSS and modern security frameworks
- AWS or security-related certifications
This role is ideal for a software engineer or cloud architect who transitioned into security, with experience securing modern API-based platforms in AWS.
Position: Software Developer
Duration: 12-month contract plus possible extension
Location: Lansing, Michigan; 2 days per week onsite (Monday & Tuesday)
Pay Range: $45/hr - $50/hr
**Exact compensation may vary based on several factors, including skills, experience, and education.
**Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401k retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
Description:
The position is responsible for providing ongoing maintenance and support of complex Java applications and developing enhancements to applications supported within our department. The resource is integral to developing and maintaining automated processes, streamlining critical business processes, ensuring data integrity and SEM/SUITE compliance, and securing the applications. Without this resource, the organization would have to manually document and develop screen plans, which can lead to errors, data integrity issues, and ultimately incorrect information being processed and reported.
• Write well-designed, testable code using Spring MVC, the Hibernate framework for entity object mapping, jQuery/HTML5, JavaScript, HTML, XML, and Angular.
• Develop business application components using object-oriented Java/JEE technologies and design principles.
• Design and develop RESTful Web Services using Spring Web MVC framework.
• Design, develop, and maintain applications using the Apache Struts framework.
• Ability to implement design patterns like Intercepting Filter, Front Controller, Session Façade, DAO, Singleton, and Service Locator.
• Proficient in building and maintaining unit test framework with Junit and Spring Boot.
• Execute full software development life cycle (SDLC) including experience in gathering requirements and writing functional/technical specifications for complex projects.
• Develop database objects, including stored procedures and functions.
• Troubleshoot issues using SQL and PL/SQL scripts.
• Experience tuning SQL queries and scripts.
• Hands-on experience Integrating software components into a fully functional software system.
• Extensive knowledge of source control tools such as Git.
• Experience with build frameworks like Maven; maintain source code using source control, baseline software versions, and build WAR files for deployments.
• Experience developing with application servers Apache Tomcat, JBOSS, WebSphere, and OpenShift Container.
• Develop software design documents and work with stakeholders for review and approval.
• Experience developing prototypes and mockups for user review and approval.
• Experience creating flowcharts, screen layouts and documentation to ensure logical flow of the system requirements.
• Experience with React.js and modern JavaScript (ES6+).
• Understanding of Redux, React Router, and component lifecycle.
• Experience working on large agile projects.
• Experience with Java 17+, SOAP web services, and the Java Message Service (JMS) API.
• Experience with Spring Boot Projects, Spring Data, Spring Batch, Spring Security frameworks
Qualifications:
- 5+ years developing complex computer systems using Java.
- 5+ years developing complex computer systems using Java IDEs such as Eclipse and STS.
- 5+ years programming using the Java JEE Struts framework.
- 5+ years programming in SQL and/or PL/SQL.
- 5+ years programming using the Java JEE Spring/Spring Boot Framework 3.0.
- 5+ years of development using the Hibernate/JPA framework.
- 3+ years of project development using Angular/React.js JavaScript frameworks.
- 3+ years programming in the JBOSS Enterprise SOA environment, including JBOSS Workflow.
- 3+ years using CMM/CMMI Level 3 methods and practices.
- 2+ years implementing agile development processes, including test-driven development.
- 2+ years of experience with React.js and modern JavaScript (ES6+).
- Understanding of Redux, React Router, and component lifecycle.
- Exposure to DevOps practices and cloud platforms (AWS, Azure).
- Hands-on experience using AI to accelerate daily coding tasks, including code generation, refactoring and documentation.
Immediate need for a talented Sr. IT Software Developer/Go Developer. This is a 12+ month contract opportunity with long-term potential, located in Columbus, OH (Remote). Please review the job description below and contact me ASAP if you are interested.
Job ID: 26-08816
Pay Range: $67 - $68.57/hour. Employee benefits include, but are not limited to, health insurance (medical, dental, vision), 401(k) plan, and paid sick leave (depending on work location).
Key Responsibilities:
- Participate in work planning and management activities as part of a team utilizing agile methodology (work items managed in GitHub, team stand-up meetings, and other sprint ceremonies).
- Document and track assignments according to team standards and using team tools.
- Perform daily application development activities (designing solution, programming, testing, collaborating with peers etc.)
Key Requirements and Technology Experience:
- Must-have skills: Golang, AWS, Monitoring and Observability Tools, CI/CD, AWS RDS, PostgreSQL, DynamoDB, Blue/Green, Canary.
- Advanced skills in Go programming.
- Expertise in public cloud and associated toolsets (AWS, Azure, Terraform, Kubernetes/OpenShift)
- Expertise in CI/CD and release engineering tooling (Azure DevOps, GitHub Actions) and deployment patterns (blue/green, canary)
- Experience with monitoring and observability tools (Splunk, App Dynamics, Open Telemetry) and a working knowledge of networking and security
- Database expertise (AWS RDS, PostgreSQL, DynamoDB)
- Ability to hit the ground running while working independently and in a collaborative manner by taking full ownership.
- Demonstrated ability to leverage AI tools effectively without over-reliance, maintaining sound independent judgment and critical thinking.
- Hands on coding and development position.
- Comfort navigating the unknown in a small, self-directed agile team is a plus.
- Good communication skills, with the ability to translate between technical and non-technical audiences, are a plus.
Our client is a leader in the insurance industry, and we are currently interviewing to fill this and other similar contract positions. If you are interested in this position, please apply online for immediate consideration.
Pyramid Consulting, Inc. provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
By applying to our jobs you agree to receive calls, AI-generated calls, text messages, or emails from Pyramid Consulting, Inc. and its affiliates, and contracted partners. Frequency varies for text messages. Message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You can reply STOP to cancel and HELP for help. You can access our privacy policy here.
Role: Cybersecurity Engineer
Location: Atlanta, GA (Hybrid)
Duration: 6 Months
Job Responsibilities / Typical Day in the Role
- Implement and maintain WAF protections across web/API properties.
- Write and tune WAF rules (e.g., custom rules, bot controls, rate limits).
- Analyze logs/alerts to identify malicious patterns and false positives; adjust policies.
- Collaborate with product/engineering to integrate WAF in the SDLC and CI/CD.
- Build automation and capabilities via code to support WAF program
- Document runbooks, change procedures, and playbooks for common scenarios.
- Participate in on-call rotation.
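Rate-limit rules like those mentioned above are typically backed by a token-bucket counter. The sketch below is a generic, vendor-neutral illustration of that mechanism, not any particular WAF's rule API; production WAFs implement this inside the vendor's rule engine:

```python
import time

class TokenBucket:
    """Minimal token bucket: allow `rate` requests/sec with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float, now=time.monotonic):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.now = now
        self.last = now()

    def allow(self) -> bool:
        t = self.now()
        # Refill tokens for the elapsed interval, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (t - self.last) * self.rate)
        self.last = t
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# Deterministic demo using a fake clock instead of real time
clock = {"t": 0.0}
bucket = TokenBucket(rate=1.0, capacity=2.0, now=lambda: clock["t"])
burst = [bucket.allow() for _ in range(3)]  # burst of 3: two allowed, third rejected
clock["t"] += 1.0                           # one second passes -> one token refilled
later = bucket.allow()                      # allowed again
```

Tuning a WAF rate-limit rule amounts to choosing `rate` and `capacity` per client key (IP, token, session) based on the log analysis described above.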
Must Have Skills / Requirements
Experience with Cloud Environments
- 2+ years of experience
Scripting Experience
- 2+ years of experience; at least one of Python, Bash, or Go
IaC Experience
- 2+ years of experience working with infrastructure as code.
Nice to Have Skills / Preferred Requirements
- Terraform/CloudFormation for WAF config as code; CDN integrations.
- Security certs (e.g., GIAC/GWAPT, CISSP/CSSLP) are a plus but not required.
- Oracle Cloud or GCP experience.
Job Summary
We are seeking a skilled Data Engineer with 5+ years of hands-on experience designing, building, and maintaining scalable data pipelines and data platforms. The ideal candidate has strong experience working with DAG-based orchestration, cloud technologies (preferably Google Cloud Platform), SQL-driven data processing, Apache Spark, and Python-based API development using FastAPI. You will play a key role in enabling reliable data ingestion, transformation, and quality assurance across enterprise systems.
Key Responsibilities
- Design, develop, and maintain DAG-based data pipelines (Airflow or similar orchestration tools).
- Build and optimize SQL-based data transformations for analytics and reporting.
- Develop and manage batch and streaming data pipelines using Apache Spark.
- Implement Python-based REST APIs using FastAPI for data services and integrations.
- Perform data quality checks, validation, reconciliation, and anomaly detection.
- Work with cloud platforms (preferably Google Cloud Platform) for storage, compute, and orchestration.
- Architect and implement cloud-native data platforms on GCP, leveraging services such as BigQuery, BigTable, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Monitor pipeline performance, troubleshoot failures, and optimize processing efficiency.
- Collaborate with analytics, application, and business teams to understand data requirements.
- Ensure best practices around security, scalability, and maintainability.
- Ensure data quality, reliability, security, governance, and compliance with enterprise standards
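The data-quality responsibilities above (checks, validation, reconciliation) can be sketched with plain Python; the row data below is hypothetical, and in practice these checks would run inside the pipeline (e.g. as Airflow tasks) against warehouse tables:

```python
def null_check(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, r in enumerate(rows) if r.get(column) is None]

def unique_check(rows, column):
    """Return values that appear more than once in `column`."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return dupes

def reconcile_counts(source_count, target_count, tolerance=0):
    """Source/target row-count reconciliation after a load."""
    return abs(source_count - target_count) <= tolerance

# Hypothetical extracted rows with one null amount and a duplicated key
rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 2, "amount": 5.0},
]
missing = null_check(rows, "amount")      # rows failing the null check
duplicate_ids = unique_check(rows, "id")  # keys failing the uniqueness check
counts_match = reconcile_counts(len(rows), 3)
```

Failing checks would typically halt or quarantine the load and surface in the pipeline's monitoring, per the reliability and governance standards noted above.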
Required Skills & Experience
- 5+ years of experience as a Data Engineer.
- Strong experience with DAG orchestration (e.g., Apache Airflow).
- Solid understanding of cloud technologies, preferably Google Cloud Platform (GCP).
- Advanced proficiency in SQL for data processing and transformations.
- Hands-on experience running and tuning Apache Spark jobs.
- Experience developing APIs using Python and FastAPI.
- Strong understanding of data quality frameworks, checks, and validation techniques.
- Proficiency in Python, Java, Scala, or PySpark, with strong SQL expertise.
- Hands-on experience with GCP data services, including BigQuery, BigTable, Dataproc, Dataflow, and cloud-native ETL patterns.
- Experience with software delivery methodologies such as Agile, Scrum, and CI/CD practices.
- Strong analytical and problem-solving skills.
- Ability to work independently and in cross-functional teams.
- Good communication and documentation skills.