Many candidates spend months learning frameworks and finishing courses, only to freeze during technical screens, system questions, or behavioral rounds.
The result is painful: "almost hired" over and over again, while confidence drops.
The truth is that interviewing is its own skill, and most bootcamps don't teach it deeply.
They teach how to code—but not how to think out loud, structure answers, debug in real time, defend trade-offs, and communicate like an engineer.
Since 2010, SynergisticIT has helped candidates land full-time roles with many major employers.
The best way to understand this: you can be smart and still fail interviews if you don't know what the interview is truly measuring.
Interviews rarely test "can you write code at home." They test:
- Can you solve problems under constraints and time pressure?
- Can you communicate your approach clearly?
- Can you handle edge cases and complexity?
- Can you explain trade-offs and design choices?
- Can you show job-ready project depth, not just toy examples?
SynergisticIT focuses on roles such as entry-level software programmers, Java full stack developers, Python/Java developers, Data Analysts, Data Engineers, Data Scientists, and Machine Learning Engineers.
The focus areas include Java / Full Stack / DevOps and Data tracks like Data Engineering, Data Analytics/BI, ML/AI, because those are the roles employers continue to hire for.
If your pattern is "I reach interviews but don't clear them," you likely need three upgrades:
- Stronger project narratives (what you built, why it matters, how it works)
- Stronger technical foundations (DSA, OOP, APIs, SQL, pipeline design)
- Mock interview reps (realistic simulation, feedback, improvement loops)
Many jobseekers underestimate how much hiring is about clarity.
You don't need to be perfect—you need to show you can think, collaborate, and deliver.
That's why guided mock interviews and structured interview coaching can be a game-changer.
Please read our blogs:
- Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
- What Recruiters Look for in Junior Developers | SynergisticIT
- Software engineering or Data Science as a career?
- How OPT Students Can Land Tech Jobs – SynergisticIT
Ideal candidates for this version include:
- Candidates who get interviews but repeatedly fall short
- Jobseekers stuck in "screen round limbo"
- Developers who panic during live coding
- Candidates who can build projects but struggle to explain them
- Professionals who haven't interviewed in years and feel rusty
- Career changers who fear "I'm behind CS grads" (often untrue with support)
SynergisticIT provides support for candidates navigating STEM OPT extension, H1B filing, and Green Card processes (where applicable), which can matter when timing is critical.
Event videos (OCW, JavaOne, Gartner); USA Today feature. If you're tired of failing interviews and want a structured plan to convert interviews into offers, start here. Contact SynergisticIT: getting hired isn't about trying harder, it's about preparing smarter, practicing correctly, and having the right guidance.
Please note: resume databases are shared with clients, and interested clients will reach out directly if they find a qualified candidate for their req. Resume submissions may also be shared with our JOPP team database. If you do not want to be contacted, please do not submit your resume; if you are contacted and wish to opt out, please unsubscribe.
Description
Meditech Data Extraction & Reporting Engineer
Position Overview
We are seeking an experienced Meditech data engineering specialist to support healthcare data archiving and legacy application retirement projects.
This role focuses on extracting data from legacy Meditech systems and transforming it into relational SQL databases while also preserving Meditech reporting outputs and report logic for use within an archive platform.
The ideal candidate has deep experience with Meditech Client/Server data structures, NPR reporting frameworks, and the Meditech data dictionary. This individual will design and implement processes that convert Meditech hierarchical data structures into normalized relational schemas and enable reproduction or preservation of Meditech reports within a long-term archive environment.
_____________________________________
Required Qualifications
Meditech Platform Experience
Strong hands-on experience working with Meditech environments such as (but not limited to):
• Meditech Client/Server
• Meditech Magic
• Meditech 6.x
Experience working with:
• Meditech DPM structures
• NPR reporting systems
• Meditech dictionaries and pointer relationships
• Meditech segment layouts
_____________________________________
Technical Skills
• Advanced SQL development experience
• Experience designing relational database schemas
• Experience translating hierarchical data models into relational structures
• Experience building data extraction and transformation pipelines
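The skills above center on translating hierarchical data models into relational structures. As a rough illustration of that transformation, here is a minimal, self-contained Python sketch; the segment layout, field names, and tables are all hypothetical, and SQLite stands in for the target SQL platform:

```python
import sqlite3

# Hypothetical example: flattening a hierarchical record (a parent segment
# with repeating child segments, loosely in the spirit of hierarchical EHR
# data) into two normalized relational tables. Fields are illustrative only.
records = [
    {"patient_id": "P001", "name": "DOE,JANE",
     "visits": [  # repeating child segment
         {"visit_id": "V1", "date": "2024-01-15", "dept": "ED"},
         {"visit_id": "V2", "date": "2024-03-02", "dept": "LAB"},
     ]},
    {"patient_id": "P002", "name": "SMITH,JOHN",
     "visits": [{"visit_id": "V3", "date": "2024-02-20", "dept": "RAD"}]},
]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patient (patient_id TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE visit (
        visit_id   TEXT PRIMARY KEY,
        patient_id TEXT REFERENCES patient(patient_id),
        visit_date TEXT,
        dept       TEXT
    );
""")
for rec in records:
    conn.execute("INSERT INTO patient VALUES (?, ?)",
                 (rec["patient_id"], rec["name"]))
    for v in rec["visits"]:  # each repeating segment becomes a child row
        conn.execute("INSERT INTO visit VALUES (?, ?, ?, ?)",
                     (v["visit_id"], rec["patient_id"], v["date"], v["dept"]))

# Once normalized, the data is queryable with ordinary relational joins
rows = conn.execute("""
    SELECT p.name, COUNT(v.visit_id)
    FROM patient p JOIN visit v USING (patient_id)
    GROUP BY p.patient_id ORDER BY p.patient_id
""").fetchall()
print(rows)  # [('DOE,JANE', 2), ('SMITH,JOHN', 1)]
```

The key design choice is that each repeating segment in the hierarchy becomes its own table with a foreign key back to the parent, rather than being stored as a nested blob.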
Job Type & Location
This is a Contract position based out of Columbus, OH.
Pay and Benefits
The pay range for this position is $55.00 - $80.00/hr.
Eligibility requirements apply to some benefits and may depend on your job
classification and length of employment. Benefits are subject to change and may be
subject to specific elections, plan, or program terms. If eligible, the benefits
available for this temporary role may include the following:
• Medical, dental & vision
• Critical Illness, Accident, and Hospital
• 401(k) Retirement Plan – Pre-tax and Roth post-tax contributions available
• Life Insurance (Voluntary Life & AD&D for the employee and dependents)
• Short and long-term disability
• Health Spending Account (HSA)
• Transportation benefits
• Employee Assistance Program
• Time Off/Leave (PTO, Vacation or Sick Leave)
Workplace Type
This is a fully remote position.
Application Deadline
This position is anticipated to close on Mar 20, 2026.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment. Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture. Drive long-term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
- Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
- Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
- Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
- Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
- Conduct root cause analysis and implement preventive and long-term remediation solutions.
- Optimize SQL queries, tune stored procedures, and improve data processing performance.
- Document audit findings, validation processes, data flows, standards, and quality reports.
- Build dashboards and reports for data quality KPIs using Power BI/Tableau.
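To make the "automated data quality rules and validation logic" responsibility concrete, here is a minimal, self-contained sketch of rule-driven profiling with a pass-rate score per rule. The table, columns, and rules are hypothetical, and SQLite stands in for SQL Server to keep the example runnable anywhere:

```python
import sqlite3

# Illustrative data-quality check: profile a table for completeness and
# validity, then emit a pass rate (quality score) per rule.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER, email TEXT, created_at TEXT);
    INSERT INTO customer VALUES
        (1, 'a@example.com', '2024-01-01'),
        (2, NULL,            '2024-01-02'),   -- completeness failure
        (3, 'not-an-email',  '2024-01-03'),   -- validity failure
        (4, 'b@example.com', NULL);           -- completeness failure
""")

# Each rule is (name, SQL predicate that a *passing* row satisfies)
rules = [
    ("email_not_null",      "email IS NOT NULL"),
    ("email_has_at_sign",   "email LIKE '%_@_%'"),
    ("created_at_not_null", "created_at IS NOT NULL"),
]

results = {}
total = conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
for name, predicate in rules:
    passed = conn.execute(
        f"SELECT COUNT(*) FROM customer WHERE {predicate}").fetchone()[0]
    results[name] = round(passed / total, 2)  # pass rate as a quality score

print(results)
```

In a production SQL Server/SSIS setting the same idea would live in stored procedures or an SSIS validation package, with scores logged over time to feed the Power BI/Tableau KPI dashboards mentioned above.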
Data Stewardship & Governance
- Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
- Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
- Ensure proper and consistent data usage across departments and systems.
- Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
- Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
- Provide training on data entry, data handling, stewardship practices, and data literacy.
- Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield / Low-Maturity Environment
- Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
- Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
- Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
- Drive adoption of data quality and governance practices across business and technical teams.
- Support long-term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
- Advanced T-SQL and SQL Server development, debugging, and performance tuning.
- SSIS development, deployment, and troubleshooting.
- Data profiling, validation rule design, quality scoring, and measurement techniques.
- ETL/ELT pipeline design, debugging, and optimization.
- Data modeling (conceptual, logical, physical).
- Metadata management and lineage documentation.
- Reporting and dashboarding with Power BI, Tableau, or similar tools.
- Strong documentation and communication skills.
Preferred Skills
- Knowledge of DAMA-DMBOK, DCAM, MDM concepts, and governance frameworks.
- Experience in low-maturity/greenfield data environments.
- Familiarity with AI/ML data readiness and feature-store-aligned data structuring.
- Cloud data engineering exposure (Azure, Databricks, GCP).
Education
- Bachelor's degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.
- Master's degree preferred.
Certifications (Preferred)
- DAMA CDMP (Associate/Practitioner)
- EDM Council DCAM
- ASQ Data Quality Credential
- Collibra Data Steward Certification
- Certified Data Steward (eLearningCurve)
- Cloud/AI certifications (Azure, Databricks, Google)
You've done a ton of LeetCode. You've racked up certificates, aced coding challenges, and you know your way around system design like the back of your hand. On paper, you're everything a tech company wants. However, tech stacks and requirements change every day.
Since 2010, we've helped thousands of candidates land full-time jobs at tech leaders like Google, Apple, PayPal, Visa, Western Union, Wells Fargo, Wayfair, and hundreds more, with job offers of $95k to $154k.
SynergisticIT focuses on closing the gap between your tech skills and what employers want now.
Open Roles We're Hiring For (for our clients):
- Entry-Level Software Programmers (Java/Python)
- Java Full Stack Developers
- Data Analysts & BI Engineers
- Data Scientists & ML Engineers
All visa types and U.S. citizens are encouraged to apply.
Note: Internships, freelance, or personal projects will not be considered toward experience requirements.
If you submit your resume, please be advised it may be entered into a central database shared by our JOPP team (our placement program).
You may unsubscribe at any time if you receive emails from us.
Check the links below:
- SynergisticIT USA Today Article
- Videos of SynergisticIT at OCW, JavaOne, Gartner Summit
We focus on Java/Full Stack/DevOps and Data Science/Data Engineer/Data Analyst/BI Analyst/Machine Learning/AI candidates.
Ideal Candidates:
- Recent grads in CS, Engineering, Math, or Statistics with limited or no job experience
- Jobseekers who were laid off due to downsizing and want an in-demand tech stack
- Professionals seeking a career switch to tech
- Candidates with career gaps or lacking real-world experience
- Individuals looking to boost their skill portfolio for better job prospects
- Students who recently finished their Bachelor's or Master's programs
- Those struggling to land interviews despite having experience
- Candidates on F1/OPT needing a job for STEM extension or H-1B filing
Currently, we are looking for entry-level software programmers, Java Full Stack developers, Python/Java developers, Data Analysts/Data Engineers/Data Scientists, and Machine Learning engineers for full-time positions with clients.
Top tech companies are flooded with smart grads.
What gets you in the door now is real-world application, confidence in delivery, and the soft skills to own a room—or a Zoom.
Please check the below links:
- Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
- Technical Skills or Experience? Which one is important to get a Job? | SynergisticIT
- Backend vs. Full Stack Development: Job Prospects | SynergisticIT
- What Recruiters Look for in Junior Developers | SynergisticIT
- Software engineering or Data Science as a career?
- How OPT Students Can Land Tech Jobs – SynergisticIT
- Is AI Going to Replace Software Programmers? | SynergisticIT
The Market's Changed—Have You?
Job Description:
We are seeking multiple Data Center Logistics Technicians to join a large-scale data center operations team in Sandston, VA. These positions support high‑volume data center activity focused on inventory management, logistics, ticket resolution, and work‑order execution. Ideal candidates will have experience in IT support, warehouse operations, or data center environments, along with strong attention to detail and a commitment to accuracy, safety, and operational excellence. This role is well‑suited for individuals looking to grow in data center operations, infrastructure support, or technology logistics.
Role Responsibilities:
- Inventory & Asset Management:
- Perform receipt, inventory control, cycle counts, and barcode management for data center assets.
- Maintain accurate documentation and audit materials following all required processes.
- Assist with staging, stocking, and material distribution activity across the data center campus.
- Logistics & Work Execution:
- Perform material movement, labeling, organization, and inventory staging.
- Prepare hardware for deployment, including basic physical installation tasks.
- Follow documented procedures for safe, accurate execution of tasks.
- Cross‑Functional Collaboration:
- Partner with engineering and operational teams to ensure proper material flow and documentation.
- Support contractor/vendor coordination and maintain clean, organized work areas.
- Provide feedback to supervisors and teams on process improvements and operational issues.
- Compliance & Safety:
- Adhere to all PPE, safety, and high‑visibility gear requirements.
- Maintain compliance with data center security, access, and operational protocols.
- Accurately complete all assigned paperwork, reports, and inventory documentation.
Additional Skills & Qualifications:
- Experience in data center logistics, warehouse operations, IT support, or ticket‑based work management.
- Strong organizational skills with the ability to multitask in a high‑volume environment.
- Familiarity with IT hardware components and general data center operations preferred.
- Ability to lift up to 40 lbs and work on their feet throughout the shift.
- Comfortable working in an active data center environment with strict security and PPE requirements.
- Must be reliable, punctual, and able to follow structured processes precisely.
- Ability to work independently with minimal direction.
Employee Value Proposition:
This is an excellent opportunity to build a long‑term career within the data center industry. Technicians will gain hands‑on experience in material handling, logistics operations, ticket system workflows, and hardware lifecycle processes. This role offers long‑term stability and growth potential within an expanding data center environment.
Work Environment:
- Work will be performed onsite at a major data center campus.
- Technicians must be comfortable navigating a large, active facility.
- PPE is required; steel‑toe boots must be self‑provided, while other gear may be provided on‑site.
- Work schedules may include 10‑hour shifts and weekend rotations depending on business needs.
This is a Contract to Hire position based out of Sandston, VA.
Pay and Benefits
The pay range for this position is $18.00 - $24.00/hr.
Eligibility requirements apply to some benefits and may depend on your job
classification and length of employment. Benefits are subject to change and may be
subject to specific elections, plan, or program terms. If eligible, the benefits
available for this temporary role may include the following:
• Medical, dental & vision
• Critical Illness, Accident, and Hospital
• 401(k) Retirement Plan – Pre-tax and Roth post-tax contributions available
• Life Insurance (Voluntary Life & AD&D for the employee and dependents)
• Short and long-term disability
• Health Spending Account (HSA)
• Transportation benefits
• Employee Assistance Program
• Time Off/Leave (PTO, Vacation or Sick Leave)
This is a fully onsite position in Sandston,VA.
Application Deadline
This position is anticipated to close on Mar 20, 2026.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
“Let goodness, fairness, and most importantly, love prevail in business; profits will inevitably follow.” – NK Chaudhary, founder
What we do for our team members:
- Comprehensive Benefits: Company Paid Holidays, PTO, Parental Involvement Leave, Maternity/Paternity Leave, EAP, No Cost Employee Medical Plan, Vision, Dental, and Company Paid Life Insurance. We also include a match on retirement (401K/Roth).
- Career Development: We're committed to providing growth for career development within the company, supporting our team members' aspirations with a well-defined succession plan that includes a variety of training and development opportunities.
- Pet-Friendly Workplace: We welcome your furry friends! Our 'Bring Your Dogs to Work' policy creates a pet-friendly atmosphere, allowing our team members to enjoy the companionship of their dogs during the workday.
- Wellness Support: Not only do we support an active lifestyle with our on-site basketball court and yoga studio, but we host quarterly mental health events to assist in creating a well-rounded work-life harmony for our team members.
- Sustainability Efforts: Reuse, Renew, and Refresh by joining our Green Team! Responsible for harvesting from the organic community garden, donating goods to local pet shelters and schools, creating educational workshops, leading nature walks, and much more, they promote well-being through sustainable practices.
Our Values
Empowerment • Inclusiveness • Responsibility • Progressive
Learn more about our company story here.
Jaipur Rugs Foundation
Since 2004, the Jaipur Rugs Foundation has worked to improve the lives of rug-weaving artisans in India. This is done through training, skills development, and social interventions. By focusing on the ideas and solutions that create social value, the Foundation supports the dignity and heritage of these traditional artisans, believing that healthy and sustainable communities are key to the survival of traditional rug weaving. Jaipur Living has made ethical and socially conscious global citizenship the foundation of its business. Through social initiatives and the Jaipur Rugs Foundation, the company supports a supplier ecosystem without a middleman of more than 40,000 artisans in 700 villages across India by providing them with a livable wage, access to health care, leadership education, and opportunities for personal growth and development. Combining time-honored techniques and of-the-moment trends, every Jaipur Living product is as ethically and responsibly made as it is beautiful.
Learn more about the Jaipur Rugs Foundation here. We are a fast-growing, design-led B2B home décor and textiles brand with big ambitions. Over the last 12 months, we have revolutionized our technical foundation, investing in Microsoft Dynamics 365 (F&O) and a Microsoft Fabric ecosystem. We are now looking for a seasoned leader to refine our existing infrastructure, optimize our end-to-end data workflows, and bridge the gap between "raw data" and "reliable business intelligence."
This role demands a strong balance of technical depth and operational management. While you must possess expert-level proficiency in data engineering, specifically within the Microsoft Fabric ecosystem and modern data platforms, we also need a leader who is experienced in analytics, data visualization, BI, and translating business needs into analytical solutions. You will be responsible for defining and executing an outcome-based Data & Analytics strategy, building and developing a global team of data engineers, BI developers, and data analysts, and ensuring the company has trusted, scalable, and decision-ready data at every level of the organization. The ideal candidate is a Fabric-certified or Fabric-trained leader, an exceptional communicator, and a proven people manager who can balance hands-on technical depth with strategic leadership.
Key Responsibilities:
Strategic Management & Outcome-Based Delivery
- Tactical Roadmap: Develop and execute a multi-year roadmap that aligns data engineering, BI, and advanced insights with business priorities (e.g., inventory efficiency, margin protection, and growth).
- Process Standardization: Define what “good” looks like for data reliability, documentation, insight quality, and business impact
- Baseline Maturity: Shift the organization from ad-hoc reporting to repeatable, trusted, decision-ready data products
- Advance Automation: Assess the current-state landscape and define a clear path from foundational reporting to automated, predictive analytics.
- Executive Communication: Serve as the single point of accountability for all data and analytics capabilities, translating technical progress into business-relevant implications across the organization
Infrastructure Optimization & Fabric Engineering
- Systemic Optimization: Lead the audit and refinement of the existing Fabric environment (Lakehouse, Pipelines, Notebooks) to improve overall performance, stability, and refresh reliability
- Engineering Standards: Set the "gold standard" for architecture, data modeling, testing, and deployment (CI/CD), ensuring the stack is hardened for enterprise-scale growth
- Reduce Manual Effort: Minimize operational risk by standardizing pipelines, refresh processes, and metric calculations
- Automation & Reliability: Systematically identify and eliminate manual reporting and spreadsheet-based workflows through robust automation in PySpark and Fabric
- Proactive Governance: Establish monitoring, alerting, and exception-handling processes to manage data quality and refresh failures before they impact the business
Analytics & Decision Enablement
- High-Quality BI Delivery: Oversee the design and delivery of visually appealing Power BI dashboards that simplify complexity and adhere to our design-led brand standards
- Metric Governance: Ensure KPI definitions and reporting logic are consistent across the company, acting as the arbiter of "the truth" for business metrics
- Advanced Analytics: Identify and operationalize high-value use cases for predictive analytics (e.g., demand forecasting, product lifecycle analysis) as platform maturity increases
- Business Translation: Partner with business leaders to translate business requirements into scalable, intuitive, impactful analytics solutions
- Business Evolution: Lead the transition from descriptive and diagnostic reporting to forward-looking insights that support planning and decision-making
Global Team Leadership & Talent Development
- People Leadership: Directly lead and develop a 3–5 person global team (primarily based in India), establishing clear roles, accountability, and a high-performance culture
- Skill Development: Create career paths and skill-development plans for engineers and analysts to ensure consistent, high-quality delivery
- Operating Model: Build a scalable offshore capability that delivers at speed while maintaining rigorous standards for code quality and documentation
Skills & Minimum Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of knowledge, skill, and/or ability required. Reasonable accommodation may be made to enable individuals with disabilities to perform essential functions.
- 10+ years of experience in data engineering, analytics, or BI, with director-level scope or equivalent ownership
- Deep hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Notebooks, semantic models)
- Fabric certification or formal Fabric training strongly preferred
- Strong experience with PySpark and Spark-based transformations
- Strong understanding of Azure data services and modern data architectures
- Exceptional dashboard-development skills using Power BI; portfolio-quality experience preferred
- Strong understanding of data storytelling, executive-ready visualization, and intuitive UI/UX design
- Experience gathering business requirements and translating them into analytical products
- Proven experience leading and developing global / offshore teams
- Strong communicator with the ability to influence at senior levels
- Experience supporting ERP-driven environments; Dynamics 365 preferred
- Ability to juggle strategy, execution, and stakeholder communication simultaneously
Success Measures (First 12–18 Months)
- Strategy Execution: An outcome-based Data & Analytics strategy that is fully operational and tied to business outcomes
- Optimized Infrastructure: A trusted, scalable Fabric platform with significantly reduced manual reporting and 99%+ data availability
- Dashboard Adoption: A suite of high-quality dashboards used daily and weekly by business leaders to drive decision-making
- Team Growth: A high-performing global team with a track record of delivering complex analytics products with speed and precision
Physical Requirements:
- Remaining in a seated position for long periods of time
- Standing, i.e., remaining on one's feet in an upright position without moving about
- The ability to alternate between sitting and standing as needed, when this need cannot be accommodated by scheduled breaks and/or a lunch period
- Lifting and transporting items that could weigh up to 25 pounds
- Entering text or data into a computer by means of a traditional keyboard
- Expressing or exchanging ideas by means of the spoken word to impart oral information to clients and talent and to convey detailed spoken instructions to other workers accurately and quickly
- The ability to hear, understand, and distinguish speech and/or other sounds, in person and by telephone
- Clarity of vision to see computer screens and workspace
Sr. Full Stack Engineer
Job ID
2025-2140
# of Openings
1
Overview
Currently seeking multiple Full Stack Developers in support of U.S. Citizenship and Immigration Services (USCIS) Engineering Support for Identity Services (ESIS). This individual will support Agile application development technologies and capabilities in the areas of software development, systems engineering, integration, and test of software applications and infrastructure. Must be skilled with front-end, back-end, and database development. Design and implement full stack cloud solutions, including IaaS, PaaS, and SaaS. Design and deploy computing infrastructure, physical or virtual machines, and other resources such as virtual-machine disk image libraries, block and file-based storage, firewalls, load balancers, IP addresses, and virtual local area networks. Implement cloud-based platform services and cloud-based software as a service for AWS. Perform DevOps functions.
Key Skills:
- 10+ years of experience with full stack engineering with proficiency in database development/integration as well as server and client application development/integration
- Software developing experience using Python and Java Spring framework
- Experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
- AWS experience required with experience deploying enterprise applications in AWS
- Experience with CI/CD environment tools such as Docker, Jenkins, Ansible, Kubernetes
Responsibilities
- Software development with Python, Java, React, and various scripting languages
- Design data models and web APIs and creation of software tasks from system requirements
- Perform requirements analysis, design, development, unit, and integration testing of software, troubleshooting and debugging of the system
- Immediate responsibilities will include enhancing and maintaining the existing system as well as design, development, and documentation of new features
- Create Git releases and pull requests, and perform code reviews
- Query logs using Splunk and monitor dashboards using New Relic
- Use Atlassian tools for day-to-day tasks within the Scrum process
- Implement web services, data persistence access features and external interfaces
- Partner closely with front-end and database engineers to ensure features are developed holistically
- Follow Agile software development methodology and team architecture standards.
- Will need to be able to read Architecture Diagrams
- Perform test service to improve code coverage, mocking services, test driven development and unit testing
- Will modify Helm Charts, Jenkinsfiles, and Dockerfiles
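The testing responsibilities above (mocking services, test-driven development, unit testing) can be sketched in Python's standard library. The service class and endpoint path here are hypothetical, invented only to illustrate the pattern of injecting a dependency so a test can mock it:

```python
from unittest import mock

class CaseStatusService:
    """Hypothetical service wrapping an HTTP call (illustration only)."""
    def __init__(self, http_get):
        self._http_get = http_get  # injected so tests can substitute a mock

    def status(self, case_id: str) -> str:
        # Hypothetical endpoint path; the real API shape would differ.
        payload = self._http_get(f"/cases/{case_id}")
        return payload.get("status", "UNKNOWN")

# Test-driven style: assert behavior against a mocked network call.
fake_get = mock.Mock(return_value={"status": "APPROVED"})
svc = CaseStatusService(fake_get)
assert svc.status("A123") == "APPROVED"
fake_get.assert_called_once_with("/cases/A123")

# Edge case: a payload without a status field falls back to a default.
assert CaseStatusService(mock.Mock(return_value={})).status("X1") == "UNKNOWN"
```

The same injection-and-mock structure carries over directly to JUnit with Mockito on the Java side of the stack.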
Qualifications
- MUST BE US CITIZEN
- Bachelor's degree required
- Must be able to obtain and maintain a Public Trust security clearance
- 10+ years of experience in Software Engineering
- Must have experience in Python and Java Spring Framework (Boot, Batch, Data, Security)
- Must have experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
- Experience with design, development, enhancement, troubleshooting and debugging of web applications
- Must have experience in an AWS cloud environment and with CI/CD tools (e.g., Docker, Jenkins, Kubernetes) for deployment processes, monitoring production environments, and modifying Dockerfiles, Jenkinsfiles, and Helm charts
- Experience with scripting languages (Python, Bash, Powershell, Perl) is not required but nice to have
- Understanding of branching concepts and ability to use tools such as Git, VS Code, and/or Rancher to perform branching operations
- Experience with creating Git releases, creating pull requests, and reviewing code
- Experience monitoring dashboards utilizing New Relic
- Experience with Splunk to query logs
- Experience with JUnit testing preferred
- Experience creating release instructions utilizing JIRA
- Experience developing and integrating complex software systems through the full SDLC
- Experience with Agile Scrum
- Must have strong written and verbal communication skills
Target Pay Range
The pay range listed below for this position is not a guarantee of compensation or salary. The final offered salary will be influenced by a host of factors including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications. Our employees value the flexibility at Pyramid Systems that allows them to balance quality work and their personal lives. We offer competitive compensation and benefits, including our Employee Stock Ownership Program, FlexPTO, and learning and development opportunities.
Pyramid Min
USD $125,731.00/Yr.
Pyramid Max
USD $188,597.00/Yr.
Why Pyramid?
Pyramid Systems, Inc. is an award-winning technology leader driving digital transformation across federal agencies. We empower forward-thinking innovations, accelerate production-ready software, and deliver secure solutions so federal agencies can meet their mission goals. Voted a Top Workplace, both regionally (Washington, DC) and nationally (USA), the past two years (2023 and 2024) based on feedback from our employees, we are headquartered in Fairfax, VA, and have a growing national footprint. We value and promote our Flexible Workplace approach because of the positive impact it has on work-life integration. We remain committed to ensuring every employee's voice is heard, performance and results are recognized and rewarded, development and advancement is a focus, and diversity, equity and inclusion is a company priority. We offer competitive compensation and benefits (including a recently launched Employee Stock Ownership Plan - ESOP), a robust performance-based rewards program, and we know how to have fun! Our people and culture have endured and delivered for our clients for nearly three decades.
EEO Statement
Pyramid Systems, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
About Wakefern
Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.
Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.
The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.
Essential Functions
- Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
- Implement and enforce data quality and governance standards to ensure the accuracy and consistency of data.
- Provide input for project plans and timelines to align with business objectives.
- Monitor project progress, identify risks, and implement mitigation strategies.
- Work with cross-functional teams and ensure effective communication and collaboration.
- Provide regular updates to the management team.
- Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
- Communicates and promotes the code of ethics and business conduct.
- Ensures completion of required company compliance training programs.
- Is trained, either through formal education or through experience, in software/hardware technologies and development methodologies.
- Stays current through personal development and professional and industry organizations.
Responsibilities
- Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
- Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
- Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
- Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
- Ensure data solutions and data sources meet quality, security, and compliance standards.
- Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
- Provide technical training, documentation, and ongoing support to end users of data automation systems.
- Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.
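An automated pipeline of the sort described above can be sketched as a minimal extract-transform-load sequence. The record shape, field names, and in-memory "warehouse" are assumptions made purely for illustration:

```python
def extract():
    """Stand-in for pulling rows from a source system (assumed shape)."""
    return [{"sku": "A1", "price": "9.99"}, {"sku": "a2", "price": "12.50"}]

def transform(rows):
    """Normalize keys and cast types so downstream loads are consistent."""
    return [{"sku": r["sku"].upper(), "price": float(r["price"])} for r in rows]

def load(rows, sink):
    """Append transformed rows to a destination (here, an in-memory list)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0])  # 2 {'sku': 'A1', 'price': 9.99}
```

In production the same three stages would typically be separate tasks in an orchestrator such as Airflow or Cloud Composer, with the sink being BigQuery or another warehouse rather than a list.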
Qualifications
- A bachelor's degree or higher in computer science, information systems, or a related field.
- Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
- Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
- Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Experience with workflow orchestration tools such as Cloud Composer or Airflow
- Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
- Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
- Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
- Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
- Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
- Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
- Hands-on experience with IBM DataStage and Alteryx is a plus.
- Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
- Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
- Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
- Familiarity with data modeling tools.
- Familiarity with DevOps practices for data (CI/CD pipelines)
- Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
- Strong knowledge and skills in data management, data quality, and data governance.
- Strong communication, collaboration, and problem-solving skills.
- Ability to work on multiple projects and prioritize tasks effectively.
- Ability to work independently and in a team environment.
- Ability to learn new technologies and tools quickly.
- The ability to handle stressful situations.
- Highly developed business acumen.
- Strong critical thinking and decision-making skills.
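One piece of the Retrieval-Augmented Generation (RAG) pipeline work listed above, retrieving the nearest indexed passage for a query by cosine similarity, can be sketched without any vector-database dependency. The toy "embeddings" below are assumptions standing in for a real embedding model's output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, index, top_k=1):
    """Return the top_k (score, passage) pairs from a tiny in-memory index."""
    scored = [(cosine(query_vec, vec), text) for text, vec in index.items()]
    return sorted(scored, reverse=True)[:top_k]

# Toy vectors standing in for real model embeddings (assumption).
index = {
    "return policy": [0.9, 0.1, 0.0],
    "store hours":   [0.1, 0.9, 0.2],
}
print(retrieve([0.8, 0.2, 0.1], index))  # "return policy" ranks first
```

A managed vector store such as Pinecone or Vertex AI Vector Search replaces the dictionary and the linear scan, but the similarity-ranking idea is the same.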
Working Conditions & Physical Demands
This position requires in-person office presence at least 4x a week.
Compensation and Benefits
The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.
Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.
Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.
Translate business process designs into clear master and transactional data definitions for S/4HANA.
Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.
Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).
Define data objects, attributes, and mandatory fields.
Support business rules, validations, and derivations.
Align data structures to SAP best practices and industry standards.
Support data cleansing, enrichment, and harmonization activities.
Define and validate data mapping rules from legacy systems to S/4HANA.
Participate in mock conversions, data loads, and reconciliation activities.
Ensure data quality thresholds are met prior to cutover.
Support the establishment and enforcement of global data standards and policies.
Work closely with Master Data and Data Governance teams.
Help define roles, ownership, and stewardship models for value stream data.
Contribute to data quality monitoring and remediation processes.
Support functional and integrated testing with a strong focus on data accuracy.
Validate business scenarios using migrated and created data.
Support cutover planning and execution from a data perspective.
Provide post-go-live support and stabilization.
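The "data quality thresholds prior to cutover" step above can be illustrated with a minimal completeness check run against legacy records before a load. The field names and the 99% threshold are hypothetical:

```python
def readiness(records, required_fields, threshold=0.99):
    """Return (completeness rate, ready?) for a batch of migration records."""
    if not records:
        return 0.0, False
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    rate = complete / len(records)
    return rate, rate >= threshold

# Hypothetical legacy material records being validated before load.
materials = [
    {"material": "M-100", "plant": "US01", "base_uom": "EA"},
    {"material": "M-101", "plant": "US01", "base_uom": ""},   # fails: missing UoM
    {"material": "M-102", "plant": "US02", "base_uom": "KG"},
]
rate, ready = readiness(materials, ["material", "plant", "base_uom"])
print(f"complete: {rate:.0%}, ready for cutover: {ready}")  # 67%, False
```

Real mock conversions would add type, referential, and business-rule checks on top of simple completeness, but the threshold-gate pattern is the same.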
Requirements: 5 years of SAP functional experience with a strong data focus.
Hands-on experience with SAP S/4HANA (greenfield preferred).
Proven involvement in large-scale, global ERP implementations.
Deep understanding of value stream business processes and related data objects.
Experience supporting data migration, cleansing, and validation.
Required Skills: Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).
Understanding of S/4HANA data model changes vs. ECC.
Experience working with SAP MDG or similar governance tools preferred.
Familiarity with data migration tools (e.g., SAP Migration Cockpit, LVM, ETL tools).
Ability to read and interpret functional specs and data models.
Strong stakeholder management and communication skills.
Ability to work across global, cross-functional teams.
Detail-oriented with strong analytical and problem-solving skills.
Comfortable operating in a fast-paced transformation environment.
Preferred Skills: Experience in manufacturing, building materials, or asset-intensive industries.
Prior role as Functional Data Lead or Data Domain Lead.
Experience defining global templates and harmonized data models.
Knowledge of data quality tools and metrics.
Experience with MDG and setting up cost center and profit center groups.
Candidates are required to have experience modernizing legacy Microsoft BI environments (including SSIS).
This is not an SSIS-only role.
The consultant will design, modernize, and enhance enterprise data and analytics solutions supporting Cyber Security, Physical Security, Electronic Security and Police operations.
This role includes evolving legacy SQL Server/SSIS-based processes into modern Azure data architectures while designing scalable new ETL/ELT pipelines and delivering executive-level analytics solutions.
The consultant will work directly with stakeholders to deliver production-grade reporting and analytics capabilities across multiple enterprise systems.
This requires architectural thinking and hands-on technical execution.
Core Responsibilities: Candidates must have direct experience building enterprise-grade ETL pipelines and executive Power BI dashboards.
- Design and implement modern ETL/ELT pipelines in Azure
- Assess and refactor existing SSIS packages as part of broader modernization efforts
- Architect Lakehouse / Medallion data models
- Develop optimized dimensional data models (star schema)
- Integrate data from SQL Server, Oracle, APIs, and security platforms
- Design and deploy enterprise Power BI dashboards
- Build paginated reports using Power BI Report Builder
- Optimize DAX and dataset performance
- Implement Row-Level Security (RLS)
- Support CI/CD and DevOps deployment processes
- Produce technical documentation and data lineage artifacts
- Engage directly with executive stakeholders
Required Technical Skills (Must-Have):
Data Engineering & Architecture:
- Strong ETL/ELT design and optimization experience
- Advanced SQL (expert-level required)
- Python / PySpark
- Dimensional data modeling (star schema required)
- REST API integrations
Azure Data Stack:
- Azure Data Factory
- Azure Databricks
- Azure Synapse Analytics
- Azure Data Lake Storage
Microsoft Data Platform:
- Experience with SQL Server data warehouse environments
- Working knowledge of SSIS and experience modernizing or migrating SSIS workflows to Azure-based solutions
Power BI:
- Power BI Desktop (expert-level)
- Advanced DAX
- Executive dashboard development
- Paginated reports (Power BI Report Builder)
- Data Gateway configuration
- Incremental refresh
- Row-Level Security (RLS)
Nice to Have:
- Microsoft Purview
- Terraform (Infrastructure-as-Code)
- Orchestration tools (Airflow or equivalent)
- Security systems data integration experience
- Experience with C# / .NET web application development (for integration with internal systems or APIs)
Experience Requirements:
- 7+ years of hands-on data engineering / analytics delivery
- Demonstrated experience building production data pipelines in Azure
- Proven experience delivering executive-facing Power BI solutions
- Experience working in complex enterprise environments
Software Skills:
- 4–6 years of experience in Azure for building, deploying, and managing cloud-based data and application services.
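A star schema of the kind referenced above (a fact table foreign-keyed to dimension tables) can be sketched with Python's built-in sqlite3. The security-events domain and all column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension: event types (hypothetical security-events domain)
    CREATE TABLE dim_event (event_key INTEGER PRIMARY KEY, event_type TEXT);
    -- Fact: one row per incident, keyed to the dimension
    CREATE TABLE fact_incident (event_key INTEGER, severity INTEGER);
    INSERT INTO dim_event VALUES (1, 'badge_denied'), (2, 'malware_alert');
    INSERT INTO fact_incident VALUES (1, 2), (1, 3), (2, 5);
""")
# Typical dashboard rollup: join fact to dimension, aggregate per type.
rows = conn.execute("""
    SELECT d.event_type, COUNT(*) AS n, MAX(f.severity) AS worst
    FROM fact_incident f JOIN dim_event d USING (event_key)
    GROUP BY d.event_type ORDER BY d.event_type
""").fetchall()
print(rows)  # [('badge_denied', 2, 3), ('malware_alert', 1, 5)]
```

In a Power BI model the same shape appears as one-to-many relationships from each dimension to the fact table, with DAX measures playing the role of the aggregate expressions.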
Technical Skills:
- 2–4 years of experience in .NET code development for developing and maintaining enterprise applications and data processing components.
- 6+ years of experience in Data Modeling, including designing logical and physical data models for enterprise data warehouses and analytics systems.
- 6+ years of experience in Python scripting for data processing, automation, ETL development, and data transformation tasks.
- 6+ years of experience in Structured Query Language (SQL) for writing complex queries, stored procedures, performance tuning, and data manipulation.