Stack Using Array Definition in Data Structure — Jobs Hiring Now in the USA

39,111 positions found — Page 2

Data Quality Analyst / Data Steward
✦ New
Salary not disclosed
Montgomery 1 day ago
Job Requisition: Data Quality Analyst / Data Steward
Contract Length: Long term – potential renewal each fiscal year
Work Location: 100% onsite – Montgomery, AL

Candidate Profile: An experienced data professional capable of building, advancing, and scaling data quality and governance foundations from scratch.

Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.

Role Purpose

Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.

Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture.

Drive long-term improvement of data standards, definitions, lineage, and quality processes.

Key Responsibilities

Data Quality & Engineering

Perform data audits, profiling, validation, anomaly detection, and quality gap identification.

Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
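The kind of automated validation rule this posting describes can be sketched as follows. The example is in Python for illustration only (the role itself implements such rules in T-SQL and stored procedures), and the rule names, fields, and 0.95 pass threshold are hypothetical assumptions, not this employer's specification.

```python
# Hypothetical data-quality rules, sketched in Python for illustration;
# the posting itself calls for T-SQL / stored procedures.
# Rule names, fields, and the 0.95 threshold are assumptions.

def check_completeness(rows, field):
    """Fraction of rows where `field` is present and non-empty."""
    if not rows:
        return 1.0
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def check_validity(rows, field, predicate):
    """Fraction of rows where `field` satisfies `predicate`."""
    if not rows:
        return 1.0
    valid = sum(1 for r in rows if predicate(r.get(field)))
    return valid / len(rows)

def run_rules(rows, threshold=0.95):
    """Score each rule and flag anything below the pass threshold."""
    scores = {
        "customer_id_complete": check_completeness(rows, "customer_id"),
        "amount_non_negative": check_validity(
            rows, "amount",
            lambda v: isinstance(v, (int, float)) and v >= 0),
    }
    return {name: {"score": s, "passed": s >= threshold}
            for name, s in scores.items()}
```

In a production setting, scores like these would feed the quality KPIs and dashboards the posting mentions.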

Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.

Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.

Conduct root cause analysis and implement preventive and long term remediation solutions.

Optimize SQL queries, tune stored procedures, and improve data processing performance.

Document audit findings, validation processes, data flows, standards, and quality reports.

Build dashboards and reports for data quality KPIs using Power BI/Tableau.

Data Stewardship & Governance

Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.

Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.

Ensure proper and consistent data usage across departments and systems.

Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.

Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.

Provide training on data entry, data handling, stewardship practices, and data literacy.

Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.

Greenfield / Low-Maturity Environment

Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.

Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.

Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.

Drive adoption of data quality and governance practices across business and technical teams.

Support long term evolution of enterprise data strategy and governance maturity.

Required Technical Skills

Advanced T-SQL, SQL Server development, debugging, and performance tuning.

SSIS development, deployment, and troubleshooting.

Data profiling, validation rule design, quality scoring, and measurement techniques.

ETL/ELT pipeline design, debugging, and optimization.

Data modeling (conceptual, logical, physical).

Metadata management and lineage documentation.

Reporting and dashboarding with Power BI, Tableau, or similar tools.

Strong documentation and communication skills.

Preferred Skills

Knowledge of DAMA DMBoK, DCAM, MDM concepts, and governance frameworks.

Experience in low-maturity/greenfield data environments.

Familiarity with AI/ML data readiness and feature-store-aligned data structuring.

Cloud data engineering exposure (Azure, Databricks, GCP).

Education

Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.

Master’s degree preferred.

Certifications (Preferred)

DAMA CDMP (Associate/Practitioner), EDM Council DCAM, ASQ Data Quality Credential, Collibra Data Steward Certification, Certified Data Steward (eLearningCurve), Cloud/AI certifications (Azure, Databricks, Google)
Junior Java developer/Entry level Data Scientist/AI engineer
Salary not disclosed
New York 3 days ago
500+ LeetCode Problems! Still No Offers? Let's Get You Offers with SynergisticIT.

You've done a ton of LeetCode.

You've racked up certificates, aced LeetCode challenges, and you know your way around system design like the back of your hand.

On paper, you're everything a tech company wants.

However, tech stacks and requirements change every day.

Since 2010, we've helped thousands of candidates land full-time jobs at tech leaders like Google, Apple, PayPal, Visa, Western Union, Wells Fargo, Wayfair, and hundreds more, with job offers of $95k to $154k.

SynergisticIT focuses on closing the gap between your tech skills and what employers want now.

Open Roles We're Hiring For (on behalf of our clients):

  • Entry-Level Software Programmers (Java/Python)
  • Java Full Stack Developers
  • Data Analysts & BI Engineers
  • Data Scientists & ML Engineers

All visa types and U.S. citizens are encouraged to apply.

Note: Internships, freelance, or personal projects will not be considered toward experience requirements.

If you submit your resume, please be advised it may be entered into a central database shared by our JOPP team (our placement program).

You may unsubscribe from any emails you receive.

Check the links below:

  • SynergisticIT USA Today Article
  • Videos of SynergisticIT at OCW, JavaOne, Gartner Summit

We focus on Java/full stack/DevOps and data science/data engineer/data analyst/BI analyst/machine learning/AI candidates.

Ideal Candidates:

  • Recent grads in CS, Engineering, Math, or Statistics with limited or no job experience
  • Jobseekers laid off due to downsizing who want to get into an in-demand tech stack
  • Professionals seeking a career switch to tech
  • Candidates with career gaps or lacking real-world experience
  • Individuals looking to boost their skill portfolio for better job prospects
  • Students who recently finished their Bachelor's or Master's programs
  • Those struggling to land interviews despite having experience
  • Candidates on F1/OPT needing a job for STEM extension or H-1B filing

Currently, we are looking for entry-level software programmers, Java full stack developers, Python/Java developers, data analysts/data engineers/data scientists, and machine learning engineers for full-time positions with clients.

Top tech companies are flooded with smart grads.

What gets you in the door now is real-world application, confidence in delivery, and the soft skills to own a room—or a Zoom.

Please check the links below:

  • Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
  • Technical Skills or Experience? Which one is important to get a Job? | SynergisticIT
  • Backend vs. Full Stack Development: Job Prospects | SynergisticIT
  • What Recruiters Look for in Junior Developers | SynergisticIT
  • Software engineering or Data Science as a career?
  • How OPT Students Can Land Tech Jobs | SynergisticIT
  • Is AI Going to Replace Software Programmers? | SynergisticIT
  • The Market's Changed—Have You?

Please note: Resume databases are shared with clients, and interested clients will reach out directly if they find a qualified candidate for their req.

Resume submissions may be shared with our JOPP team database also.

Please unsubscribe if contacted; if you don't want to be contacted, please don't submit your resume.
Director of Data and Analytics
✦ New
Salary not disclosed
Acworth, GA 1 hour ago

“Let goodness, fairness, and most importantly, love prevail in business; profits will inevitably follow.” – NK Chaudhary, founder


What we do for our team members:

  • Comprehensive Benefits: Company Paid Holidays, PTO, Parental Involvement Leave, Maternity/Paternity Leave, EAP, No Cost Employee Medical Plan, Vision, Dental, and Company Paid Life Insurance. We also include a match on retirement (401K/Roth).
  • Career Development: We're committed to providing growth for career development within the company, supporting our team members' aspirations with a well-defined succession plan that includes a variety of training and development opportunities.
  • Pet-Friendly Workplace: We welcome your furry friends! Our 'Bring Your Dogs to Work' policy creates a pet-friendly atmosphere, allowing our team members to enjoy the companionship of their dogs during the workday.
  • Wellness Support: Not only do we support an active lifestyle with our on-site basketball court and yoga studio, but we host quarterly mental health events to assist in creating a well-rounded work-life harmony for our team members.
  • Sustainability Efforts: Reuse, Renew, and Refresh by joining our Green Team! Responsible for harvesting from the organic community garden, donating goods to local pet shelters and schools, creating educational workshops, leading nature walks, and much more, they promote well-being through sustainable practices.


Our Values

Empowerment • Inclusiveness • Responsibility • Progressive

Learn more about our company story here: Jaipur Rugs Foundation

Since 2004, the Jaipur Rugs Foundation has worked to improve the lives of rug-weaving artisans in India. This is done through training, skills development, and social interventions. By focusing on the ideas and solutions that create social value, the Foundation supports the dignity and heritage of these traditional artisans, believing that healthy and sustainable communities are key to the survival of traditional rug weaving. Jaipur Living has made ethical and socially conscious global citizenship the foundation of its business. Through social initiatives and the Jaipur Rugs Foundation, the company supports a middleman-free supplier ecosystem of more than 40,000 artisans in 700 villages across India by providing them with a livable wage, access to health care, leadership education, and opportunities for personal growth and development. Combining time-honored techniques and of-the-moment trends, every Jaipur Living product is as ethically and responsibly made as it is beautiful.

Learn more about the Jaipur Rugs Foundation here.

We are a fast-growing, design-led B2B home décor and textiles brand with big ambitions. Over the last 12 months, we have revolutionized our technical foundation, investing in Microsoft Dynamics 365 (F&O) and a Microsoft Fabric ecosystem. We are now looking for a seasoned leader to refine our existing infrastructure, optimize our end-to-end data workflows, and bridge the gap between "raw data" and "reliable business intelligence."


This role demands a strong balance of technical depth and operational management. While you must possess expert-level proficiency in data engineering, specifically within the Microsoft Fabric ecosystem and modern data platforms, we also need a leader who is experienced in analytics, data visualization, BI, and translating business needs into analytical solutions. You will be responsible for defining and executing an outcome-based Data & Analytics strategy, building and developing a global team of data engineers, BI developers, and data analysts, and ensuring the company has trusted, scalable, and decision-ready data at every level of the organization. The ideal candidate is a Fabric-certified or Fabric-trained leader, an exceptional communicator, and a proven people manager who can balance hands-on technical depth with strategic leadership.


Key Responsibilities:

Strategic Management & Outcome-Based Delivery

  • Tactical Roadmap: Develop and execute a multi-year roadmap that aligns data engineering, BI, and advanced insights with business priorities (e.g., inventory efficiency, margin protection, and growth).
  • Process Standardization: Define what “good” looks like for data reliability, documentation, insight quality, and business impact
  • Baseline Maturity: Shift the organization from ad-hoc reporting to repeatable, trusted, decision-ready data products
  • Advance Automation: Assess the current-state landscape and define a clear path from foundational reporting to automated, predictive analytics.
  • Executive Communication: Serve as the single point of accountability for all data and analytics capabilities, translating technical progress into business-relevant implications across the organization

Infrastructure Optimization & Fabric Engineering

  • Systemic Optimization: Lead the audit and refinement of the existing Fabric environment (Lakehouse, Pipelines, Notebooks) to improve overall performance, stability, and refresh reliability
  • Engineering Standards: Set the "gold standard" for architecture, data modeling, testing, and deployment (CI/CD), ensuring the stack is hardened for enterprise-scale growth
  • Reduce Manual Effort: Minimize operational risk by standardizing pipelines, refresh processes, and metric calculations
  • Automation & Reliability: Systematically identify and eliminate manual reporting and spreadsheet-based workflows through robust automation in PySpark and Fabric
  • Proactive Governance: Establish monitoring, alerting, and exception-handling processes to manage data quality and refresh failures before they impact the business
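A minimal sketch of the proactive refresh monitoring described above, assuming a simple dataset-to-last-refresh mapping. The dataset names and 24-hour SLA are illustrative assumptions; a real Fabric deployment would pull refresh timestamps from pipeline run logs rather than an in-memory dict.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness monitor: flag datasets whose last successful
# refresh breached an SLA window, so failures surface before the
# business notices. Dataset names and the 24-hour SLA are assumptions.

def stale_datasets(last_refresh, sla=timedelta(hours=24), now=None):
    """Return names of datasets whose last refresh is older than `sla`.

    last_refresh: mapping of dataset name -> timezone-aware datetime.
    """
    now = now or datetime.now(timezone.utc)
    return sorted(name for name, ts in last_refresh.items() if now - ts > sla)
```

A scheduler could run this check hourly and route any non-empty result to an alerting channel, which is the "manage failures before they impact the business" behavior the bullet describes.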

Analytics & Decision Enablement

  • High-Quality BI Delivery: Oversee the design and delivery of visually appealing Power BI dashboards that simplify complexity and adhere to our design-led brand standards
  • Metric Governance: Ensure KPI definitions and reporting logic are consistent across the company, acting as the arbiter of "the truth" for business metrics
  • Advanced Analytics: Identify and operationalize high-value use cases for predictive analytics (e.g., demand forecasting, product lifecycle analysis) as platform maturity increases
  • Business Translation: Partner with business leaders to translate business requirements into scalable, intuitive, impactful analytics solutions
  • Business Evolution: Lead the transition from descriptive and diagnostic reporting to forward-looking insights that support planning and decision-making

Global Team Leadership & Talent Development

  • People Leadership: Directly lead and develop a 3–5 person global team (primarily based in India), establishing clear roles, accountability, and a high-performance culture
  • Skill Development: Create career paths and skill-development plans for engineers and analysts to ensure consistent, high-quality delivery
  • Operating Model: Build a scalable offshore capability that delivers at speed while maintaining rigorous standards for code quality and documentation

Skills & Minimum Qualifications:

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of knowledge, skill, and/or ability required. Reasonable accommodation may be made to enable individuals with disabilities to perform essential functions.

  • 10+ years of experience in data engineering, analytics, or BI, with director-level scope or equivalent ownership
  • Deep hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Notebooks, semantic models)
  • Fabric certification or formal Fabric training strongly preferred
  • Strong experience with PySpark and Spark-based transformations
  • Strong understanding of Azure data services and modern data architectures
  • Exceptional dashboard-development skills using Power BI; portfolio-quality experience preferred
  • Strong understanding of data storytelling, executive-ready visualization, and intuitive UI/UX design
  • Experience gathering business requirements and translating them into analytical products
  • Proven experience leading and developing global / offshore teams
  • Strong communicator with the ability to influence at senior levels
  • Experience supporting ERP-driven environments; Dynamics 365 preferred
  • Ability to juggle strategy, execution, and stakeholder communication simultaneously

Success Measures (First 12–18 Months)

  • Strategy Execution: An outcome-based Data & Analytics strategy that is fully operational and tied to business outcomes
  • Optimized Infrastructure: A trusted, scalable Fabric platform with significantly reduced manual reporting and 99%+ data availability
  • Dashboard Adoption: A suite of high-quality dashboards used daily and weekly by business leaders to drive decision-making
  • Team Growth: A high-performing global team with a track record of delivering complex analytics products with speed and precision

Physical Requirements:

  • Remaining in a seated position for long periods of time
  • Standing, i.e., remaining on one’s feet in an upright position without moving about
  • The ability to alternate between sitting and standing as needed, when this need cannot be accommodated by scheduled breaks and/or a lunch period
  • Lifting and transporting items that could weigh up to 25 pounds
  • Entering text or data into a computer by means of a traditional keyboard
  • Expressing or exchanging ideas by means of the spoken word to impart oral information to clients and talent and to convey detailed spoken instructions to other workers accurately and quickly
  • The ability to hear, understand, and distinguish speech and/or other sounds, such as in person and by telephone
  • Clarity of vision to see computer screens and workspace
Data Engineer
✦ New
Salary not disclosed
Towson, Maryland 11 hours ago

Sr Data & BI Engineer (Hybrid)

We're partnering with a growing organization seeking a SQL-focused Data & BI Engineer to build and optimize data pipelines, support ETL processes, and drive reporting infrastructure. This role sits at the intersection of data engineering and business intelligence, with strong visibility across teams and leadership.

What You'll Do

  • Design, build, and maintain SQL-based data pipelines and transformations
  • Develop and optimize ETL processes to support reporting and analytics
  • Write performant SQL for data modeling, transformation, and downstream consumption
  • Support and enhance reporting infrastructure (SSRS → Power BI migration)
  • Partner with business and technical teams to deliver scalable data solutions
  • Improve data quality, structure, and accessibility across systems
  • Contribute to performance tuning and optimization of data workflows
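The pipeline work described above can be sketched minimally with Python's built-in sqlite3 module. The table names, quality filter, and aggregation below are illustrative assumptions, not this employer's actual schema; the same pattern applies to the SQL Server stack the role uses.

```python
import sqlite3

# Minimal sketch of one SQL transformation step in a pipeline: take
# staged raw rows, apply a basic quality filter, and build an
# aggregated table for downstream reporting. Names are illustrative.

def run_pipeline(conn):
    """Rebuild the reporting table from the raw_orders staging table."""
    conn.executescript("""
        DROP TABLE IF EXISTS sales_by_region;
        CREATE TABLE sales_by_region AS
        SELECT region,
               SUM(amount) AS total_sales,
               COUNT(*)    AS order_count
        FROM raw_orders
        WHERE amount IS NOT NULL AND amount >= 0  -- basic quality filter
        GROUP BY region;
    """)
```

Keeping the transformation as a single idempotent rebuild (drop and recreate) is one simple way to make reruns safe after an upstream failure.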

What You Bring

  • Strong SQL skills with experience in data transformation and pipeline development
  • Experience with ETL tools or frameworks (SSIS or similar)
  • Exposure to BI tools such as Power BI or SSRS
  • Experience working with structured data models in a production environment
  • Ability to operate across both data engineering and reporting use cases

Environment

  • Hybrid: 3 days onsite
  • Evolving data environment with active investment in modernization
  • Transitioning reporting stack from SSRS to Power BI
  • Collaborative team with dedicated DBA support

Compensation

$120K – $140K base + bonus potential and good benefits

SAP S/4HANA Functional Process Data Expert
Salary not disclosed
Atlanta 3 days ago
Summary:

Location: Atlanta, GA
Duration: 12 Months
100% Remote – open to any area

Responsibilities:

Partner with global and regional business stakeholders to define data requirements aligned to standardized value stream processes.

Translate business process designs into clear master and transactional data definitions for S/4HANA.

Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.

Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).

Define data objects, attributes, and mandatory fields.

Support business rules, validations, and derivations.

Align data structures to SAP best practices and industry standards.

Support data cleansing, enrichment, and harmonization activities.

Define and validate data mapping rules from legacy systems to S/4HANA.

Participate in mock conversions, data loads, and reconciliation activities.

Ensure data quality thresholds are met prior to cutover.
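A pre-cutover threshold check of the kind described here can be sketched as follows. The object names and the 99.5% load-ratio threshold are illustrative assumptions; a real S/4HANA migration would reconcile far more than row counts.

```python
# Hypothetical pre-cutover reconciliation: compare legacy vs. target
# row counts per data object and check each against a quality
# threshold before sign-off. Names and the 0.995 threshold are
# illustrative assumptions.

def reconcile(legacy_counts, target_counts, threshold=0.995):
    """Return per-object load ratios and whether each meets the threshold."""
    report = {}
    for obj, expected in legacy_counts.items():
        loaded = target_counts.get(obj, 0)
        ratio = loaded / expected if expected else 1.0
        report[obj] = {"loaded": loaded, "expected": expected,
                       "ratio": round(ratio, 4), "ok": ratio >= threshold}
    return report

def ready_for_cutover(report):
    """Cutover is cleared only when every object meets its threshold."""
    return all(entry["ok"] for entry in report.values())
```

Running this after each mock conversion gives a repeatable gate for the reconciliation activities the posting mentions.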

Support the establishment and enforcement of global data standards and policies.

Work closely with Master Data and Data Governance teams.

Help define roles, ownership, and stewardship models for value stream data.

Contribute to data quality monitoring and remediation processes.

Support functional and integrated testing with a strong focus on data accuracy.

Validate business scenarios using migrated and created data.

Support cutover planning and execution from a data perspective.

Provide post-go-live support and stabilization.

Requirements:

5 years of SAP functional experience with a strong data focus.

Hands-on experience with SAP S/4HANA (greenfield preferred).

Proven involvement in large-scale, global ERP implementations.

Deep understanding of value stream business processes and related data objects.

Experience supporting data migration, cleansing, and validation.

Required Skills:

Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).

Understanding of S/4HANA data model changes vs. ECC.

Experience working with SAP MDG or similar governance tools preferred.

Familiarity with data migration tools (e.g., SAP Migration Cockpit, LSMW, ETL tools).

Ability to read and interpret functional specs and data models.

Strong stakeholder management and communication skills.

Ability to work across global, cross-functional teams.

Detail-oriented with strong analytical and problem-solving skills.

Comfortable operating in a fast-paced transformation environment.

Preferred Skills:

Experience in manufacturing, building materials, or asset-intensive industries.

Prior role as Functional Data Lead or Data Domain Lead.

Experience defining global templates and harmonized data models.

Knowledge of data quality tools and metrics.

Experience with MDG and setting up cost center and profit center groups.
Data Integration & AI Engineer
✦ New
Salary not disclosed
Edison, NJ 1 day ago

About Wakefern

Wakefern Food Corp. is the largest retailer-owned cooperative in the United States and supports its co-operative members' retail operations, trading under the ShopRite®, Price Rite®, The Fresh Grocer®, Dearborn Markets®, and Gourmet Garage® banners.


Employing an innovative approach to wholesale business services, Wakefern focuses on helping the independent retailer compete in a big business world. Providing the tools entrepreneurs need to stay a step ahead of the competition, Wakefern’s co-operative members benefit from the company’s extensive portfolio of services, including innovative technology, private label development, and best-in-class procurement practices.


The ideal candidate will have a strong background in designing, developing, and implementing complex projects, with a focus on automating data processes and driving efficiency within the organization. This role requires close collaboration with application developers, data engineers, data analysts, and data scientists to ensure seamless data integration and automation across various platforms. The Data Integration & AI Engineer is responsible for identifying opportunities to automate repetitive data processes, reduce manual intervention, and improve overall data accessibility.


Essential Functions

  • Participate in the development life cycle (requirements definition, project approval, design, development, and implementation) and maintenance of the systems.
  • Implement and enforce data quality and governance standards to ensure accuracy and consistency.
  • Provide input for project plans and timelines to align with business objectives.
  • Monitor project progress, identify risks, and implement mitigation strategies.
  • Work with cross-functional teams and ensure effective communication and collaboration.
  • Provide regular updates to the management team.
  • Follow the standards and procedures according to Architecture Review Board best practices, revising standards and procedures as requirements change and technological advancements are incorporated into the technology structure.
  • Communicate and promote the code of ethics and business conduct.
  • Ensure completion of required company compliance training programs.
  • Be trained – either through formal education or through experience – in software/hardware technologies and development methodologies.
  • Stay current through personal development and professional and industry organizations.

Responsibilities

  • Design, build, and maintain automated data pipelines and ETL processes to ensure scalability, efficiency, and reliability across data operations.
  • Develop and implement robust data integration solutions to streamline data flow between diverse systems and databases.
  • Continuously optimize data workflows and automation processes to enhance performance, scalability, and maintainability.
  • Design and develop end-to-end data solutions utilizing modern technologies, including scripting languages, databases, APIs, and cloud platforms.
  • Ensure data solutions and data sources meet quality, security, and compliance standards.
  • Monitor and troubleshoot automation workflows, proactively identifying and resolving issues to minimize downtime.
  • Provide technical training, documentation, and ongoing support to end users of data automation systems.
  • Prepare and maintain comprehensive technical documentation, including solution designs, specifications, and operational procedures.


Qualifications

  • A bachelor's degree or higher in computer science, information systems, or a related field.
  • Hands-on experience with cloud data platforms (e.g., GCP, Azure, etc.)
  • Strong knowledge and skills in data automation technologies, such as Python, SQL, ETL/ELT tools, Kafka, APIs, cloud data pipelines, etc.
  • Experience in GCP BigQuery, Dataflow, Pub/Sub, and Cloud storage.
  • Experience with workflow orchestration tools such as Cloud Composer or Airflow
  • Proficiency in iPaaS (Integration Platform as a Service) platforms, such as Boomi, SAP BTP, etc.
  • Develop and manage data integrations for AI agents, connecting them to internal and external APIs, databases, and knowledge sources to expand their capabilities.
  • Build and maintain scalable Retrieval-Augmented Generation (RAG) pipelines, including the curation and indexing of knowledge bases in vector databases (e.g., Pinecone, Vertex AI Vector Search).
  • Leverage cloud-based AI/ML platforms (e.g., Vertex AI, Azure ML) to build, train, and deploy machine learning models at scale.
  • Establish and enforce data quality and governance standards for AI/ML datasets, ensuring the accuracy, completeness, and integrity of data used for model training and validation.
  • Collaborate closely with data scientists and machine learning engineers to understand data requirements and deliver optimized data solutions that support the entire machine learning lifecycle.
  • Hands-on experience with IBM DataStage and Alteryx is a plus.
  • Strong understanding of database design principles, including normalization, indexing, partitioning, and query optimization.
  • Ability to design and maintain efficient, scalable, and well-structured database schemas to support both analytical and transactional workloads.
  • Familiarity with BI visualization tools such as MicroStrategy, Power BI, Looker, or similar.
  • Familiarity with data modeling tools.
  • Familiarity with DevOps practices for data (CI/CD pipelines)
  • Proficiency in project management software (e.g., JIRA, Clarizen, etc.)
  • Strong knowledge and skills in data management, data quality, and data governance.
  • Strong communication, collaboration, and problem-solving skills.
  • Ability to work on multiple projects and prioritize tasks effectively.
  • Ability to work independently and in a team environment.
  • Ability to learn new technologies and tools quickly.
  • The ability to handle stressful situations.
  • Highly developed business acuity and acumen.
  • Strong critical thinking and decision-making skills.
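The Retrieval-Augmented Generation (RAG) pipeline work listed in the qualifications above centers on a retrieval step. A minimal sketch, with toy embeddings and chunks, and plain cosine similarity standing in for what a vector database such as Pinecone or Vertex AI Vector Search would do at scale:

```python
import math

# Minimal sketch of RAG retrieval: rank indexed knowledge chunks by
# cosine similarity to a query embedding. The embeddings and chunk
# texts here are toy assumptions; production systems use a real
# embedding model and a vector database.

def cosine(a, b):
    """Cosine similarity of two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, index, top_k=2):
    """index: list of (chunk_text, embedding) pairs. Returns top_k chunks."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]
```

The retrieved chunks are what gets prepended to the model prompt; curating and indexing the knowledge base, as the posting says, is what determines whether this step returns anything useful.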


Working Conditions & Physical Demands

This position requires in-person office presence at least 4x a week.


Compensation and Benefits

The salary range for this position is $75,868 - $150,644. Placement in the range depends on several factors, including experience, skills, education, geography, and budget considerations.

Wakefern is proud to offer a comprehensive benefits package designed to support the health, well-being, and professional development of our Associates. Benefits include medical, dental, and vision coverage, life and disability insurance, a 401(k) retirement plan with company match & annual company contribution, paid time off, holidays, and parental leave.


Associates also enjoy access to wellness and family support programs, fitness reimbursement, educational and training opportunities through our corporate university, and a collaborative, team-oriented work environment. Many of these benefits are fully or partially funded by the company, with some subject to eligibility requirements.

Sr. Full Stack Engineer
Salary not disclosed
Fairfax, VA 3 days ago


Sr. Full Stack Engineer

Job ID

2025-2140

# of Openings

1

Overview

Currently seeking multiple Full Stack Developers in support of the U.S. Citizenship and Immigration Services (USCIS) Engineering Support for Identity Services (ESIS) program. This individual will support Agile application development technologies and capabilities in the areas of software development, systems engineering, integration, and test of software applications and infrastructure. Will be skilled with front-end, back-end, and database development. Design and implement full stack cloud solutions to include IaaS, PaaS, and SaaS. Design and deploy computing infrastructure, physical or virtual machines, and other resources like virtual-machine disk image libraries, block and file-based storage, firewalls, load balancers, IP addresses, and virtual local area networks. Implement cloud-based platform services for AWS. Implement cloud-based software as a service for AWS. Perform DevOps functions.

Key Skills:

  • 10+ years of experience with full stack engineering with proficiency in database development/integration as well as server and client application development/integration
  • Software developing experience using Python and Java Spring framework
  • Experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
  • AWS experience required with experience deploying enterprise applications in AWS
  • Experience with CI/CD environment tools such as Docker, Jenkins, Ansible, Kubernetes


Responsibilities

  • Software development with Python, Java, React, and various scripting languages
  • Design data models and web APIs and creation of software tasks from system requirements
  • Perform requirements analysis, design, development, unit, and integration testing of software, troubleshooting and debugging of the system
  • Immediate responsibilities will include enhancing and maintaining the existing system as well as design, development, and documentation of new features
  • Create Git releases, pull requests, and code reviews
  • Query logs using Splunk and monitor dashboards using New Relic
  • Use Atlassian tools for day-to-day tasks within the Scrum process
  • Implement web services, data persistence access features and external interfaces
  • Partner closely with front-end and database engineers to ensure features are developed holistically
  • Follow Agile software development methodology and team architecture standards.
  • Must be able to read architecture diagrams
  • Write tests to improve code coverage, including mocking services, test-driven development, and unit testing
  • Will modify Helm Charts, Jenkinsfiles, and Dockerfiles
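The testing responsibilities above (mocking services, test-driven development, unit testing) can be sketched minimally with Python's standard-library `unittest.mock`. The service client and `fetch_user_status` function here are hypothetical illustrations, not part of this posting's actual system.

```python
from unittest.mock import Mock

# Hypothetical service function; in the real system this would wrap a remote API call.
def fetch_user_status(client, user_id):
    """Return a user's status string, defaulting to 'unknown' when absent."""
    record = client.get_user(user_id)  # network call in production, mocked in tests
    return record.get("status", "unknown")

def test_returns_status_from_service():
    client = Mock()
    client.get_user.return_value = {"status": "active"}
    assert fetch_user_status(client, 42) == "active"
    client.get_user.assert_called_once_with(42)  # verify the call was made as expected

def test_defaults_when_status_missing():
    client = Mock()
    client.get_user.return_value = {}
    assert fetch_user_status(client, 7) == "unknown"

test_returns_status_from_service()
test_defaults_when_status_missing()
```

Mocking the client this way lets unit tests exercise the logic without any network dependency, which is what raises code coverage on service-integration code.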


Qualifications

  • MUST BE US CITIZEN
  • Bachelor's degree required
  • Must be able to obtain and maintain a Public Trust security clearance
  • 10+ years of experience in Software Engineering
  • Must have experience in Python and Java Spring Framework (Boot, Batch, Data, Security)
  • Must have experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
  • Experience with design, development, enhancement, troubleshooting and debugging of web applications
  • Must have experience in an AWS cloud environment and with CI/CD tools (e.g., Docker, Jenkins, Kubernetes) for deployment processes, monitoring production environments, and modifying Dockerfiles, Jenkinsfiles, and Helm charts
  • Experience with scripting languages (Python, Bash, Powershell, Perl) is not required but nice to have
  • Understanding of branching concepts and experience using tools such as Git, VS Code, and/or Rancher
  • Experience with creating Git releases, creating pull requests, and reviewing code
  • Experience monitoring dashboards utilizing New Relic
  • Experience with Splunk to query logs
  • Experience with Junit testing preferred
  • Experience creating release instructions utilizing JIRA
  • Experience developing and integrating complex software systems through the full SDLC
  • Experience with Agile Scrum
  • Must have strong written and verbal communication skills


Target Pay Range

The pay range listed below is not a guarantee of compensation or salary. The final offered salary will be influenced by a host of factors including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications. Our employees value the flexibility at Pyramid Systems that allows them to balance quality work and their personal lives. We offer competitive compensation and benefits, including our Employee Stock Ownership Program, FlexPTO, and learning and development opportunities.

Pyramid Min

USD $125,731.00/Yr.

Pyramid Max

USD $188,597.00/Yr.

Why Pyramid?

Pyramid Systems, Inc. is an award-winning technology leader driving digital transformation across federal agencies. We empower forward-thinking innovations, accelerate production-ready software, and deliver secure solutions so federal agencies can meet their mission goals. Voted a Top Workplace both regionally (Washington, DC) and nationally (USA) for the past two years (2023 and 2024) based on feedback from our employees, we are headquartered in Fairfax, VA, and have a growing national footprint. We value and promote our Flexible Workplace approach because of the positive impacts it has on work-life integration. We remain committed to ensuring every employee's voice is heard, performance and results are recognized and rewarded, development and advancement is a focus, and diversity, equity, and inclusion is a company priority. We offer competitive compensation and benefits (including a recently launched Employee Stock Ownership Plan - ESOP), a robust performance-based rewards program, and we know how to have fun! Our people and culture have endured and delivered for our clients for nearly three decades.

EEO Statement

Pyramid Systems, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.

permanent
Data Analyst
✦ New
Salary not disclosed
Des Moines, IA 5 hours ago

This is a full-time position that requires onsite presence in Des Moines, Iowa. Candidates must be authorized to work in the United States without sponsorship now or in the future.


P3+Uplift is partnering with a local insurance company to find a SQL-driven Data Analyst who enjoys working directly with business stakeholders to turn data questions into clear insights and reporting. This role is highly hands-on with SQL and data extraction, working across multiple data sources to support reporting, analysis, and data-driven decision making. The ideal candidate is both analytical and consultative—able to understand business needs, write efficient queries, and deliver clear, actionable insights.


The company offers a flexible schedule, hybrid work environment, casual dress code, and a collaborative culture, plus a comprehensive benefits package.


Key Responsibilities

  • Write and optimize SQL queries to pull and analyze data from multiple sources.
  • Partner with business teams to clarify questions, define metrics, and deliver actionable insights.
  • Build and maintain interactive reports and dashboards to support decision-making (Power BI preferred).
  • Ensure data accuracy through validation, cleansing, and reconciliation.
  • Document data sources, definitions, and analysis logic to create repeatable, reliable reporting processes.
  • Identify opportunities to streamline data workflows, improve automation, and enhance reporting efficiency.
  • Communicate findings and trends in clear, business-friendly language to stakeholders.
  • Contribute to ad-hoc analysis projects, providing insights to guide business strategy.

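The query-and-validate workflow described above can be illustrated with a small sketch using Python's built-in `sqlite3`. The table names and insurance data here are invented for illustration; they are not from the posting.

```python
import sqlite3

# Two illustrative "sources": a policy table and a payments table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policies (policy_id INTEGER PRIMARY KEY, holder TEXT, premium REAL);
    CREATE TABLE payments (policy_id INTEGER, amount REAL);
    INSERT INTO policies VALUES (1, 'Ames', 120.0), (2, 'Berg', 95.5), (3, 'Cho', 110.0);
    INSERT INTO payments VALUES (1, 120.0), (2, 40.0), (2, 55.5);
""")

# Reconciliation query: compare billed premium to total payments per policy.
rows = conn.execute("""
    SELECT p.policy_id, p.premium, COALESCE(SUM(pay.amount), 0) AS paid
    FROM policies p
    LEFT JOIN payments pay ON pay.policy_id = p.policy_id
    GROUP BY p.policy_id
""").fetchall()

# Simple validation: flag policies whose payments do not cover the premium.
unpaid = [pid for pid, premium, paid in rows if paid < premium]
print(unpaid)  # -> [3]: policy 3 has no recorded payments
```

The same pattern (pull from multiple sources, reconcile, flag exceptions) scales up to the documented, repeatable reporting processes the role calls for.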

5+ years of experience:

  • Strong SQL experience required with the ability to query and analyze large datasets.
  • Experience working with data structures, relational databases, and multiple data sources.
  • Experience with data validation, cleansing, and quality assurance.
  • Experience with Power BI or other data visualization tools preferred.
  • Ability to translate complex data into clear, business-friendly insights.
  • Strong communication skills and a consultative approach with stakeholders.


Education: Bachelor’s degree in Business, Analytics, Statistics, or a related field, or equivalent experience

Not Specified
Azure Data Engineer
✦ New
Salary not disclosed
Queens 1 day ago
Job Description: We are seeking a hands-on Consultant with strong Azure ETL experience and advanced Power BI development skills.

Candidates are required to have experience modernizing legacy Microsoft BI environments (including SSIS).

This is not an SSIS-only role.

The consultant will design, modernize, and enhance enterprise data and analytics solutions supporting Cyber Security, Physical Security, Electronic Security and Police operations.

This role includes evolving legacy SQL Server/SSIS-based processes into modern Azure data architectures while designing scalable new ETL/ELT pipelines and delivering executive-level analytics solutions.

The consultant will work directly with stakeholders to deliver production-grade reporting and analytics capabilities across multiple enterprise systems.

This requires architectural thinking and hands-on technical execution.

Core Responsibilities: Candidates must have direct experience building enterprise-grade ETL pipelines and executive Power BI dashboards.

Design and implement modern ETL/ELT pipelines in Azure Assess and refactor existing SSIS packages as part of broader modernization efforts Architect Lakehouse / Medallion data models Develop optimized dimensional data models (star schema) Integrate data from SQL Server, Oracle, APIs, and security platforms Design and deploy enterprise Power BI dashboards Build paginated reports using Power BI Report Builder Optimize DAX and dataset performance Implement Row-Level Security (RLS) Support CI/CD and DevOps deployment processes Produce technical documentation and data lineage artifacts Engage directly with executive stakeholders Required Technical Skills: (Must-Have) Data Engineering & Architecture: Strong ETL/ELT design and optimization experience Advanced SQL (expert-level required) Python / PySpark Dimensional data modeling (star schema required) REST API integrations Azure Data Stack: • Azure Data Factory • Azure Databricks • Azure Synapse Analytics • Azure Data Lake Storage Microsoft Data Platform: • Experience with SQL Server data warehouse environments • Working knowledge of SSIS and experience modernizing or migrating SSIS workflows to Azure-based solutions Power BI: Power BI Desktop (expert-level) Advanced DAX Executive dashboard development Paginated reports (Power BI Report Builder) Data Gateway configuration Incremental refresh Row-Level Security (RLS) Nice to Have: Microsoft Purview Terraform (Infrastructure-as-Code) Orchestration tools (Airflow or equivalent) Security systems data integration experience Experience with C# / .NET web application development (for integration with internal systems or APIs) Experience Requirements: 7+ years of hands-on data engineering / analytics delivery Demonstrated experience building production data pipelines in Azure Proven experience delivering executive-facing Power BI solutions Experience working in complex enterprise environments Software Skills: 4–6 years of experience in Azure for building, deploying, and managing 
cloud-based data and application services.

Technical Skills: 2–4 years of experience in .NET code development for developing and maintaining enterprise applications and data processing components.

6+ years of experience in Data Modeling including designing logical and physical data models for enterprise data warehouses and analytics systems.

6+ years of experience in Python scripting for data processing, automation, ETL development, and data transformation tasks.

6+ years of experience in Structured Query Language (SQL) for writing complex queries, stored procedures, performance tuning, and data manipulation.
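The star-schema dimensional modeling called out in this posting can be sketched minimally with Python's built-in `sqlite3`: a central fact table keyed to dimension tables, queried by joining and aggregating. The table names and sales figures below are invented for illustration, not from the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables describe entities; the fact table records measurable events.
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);
    INSERT INTO dim_date VALUES (1, 2024, 1), (2, 2024, 2);
    INSERT INTO dim_product VALUES (10, 'Widget'), (11, 'Gadget');
    INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# A typical star-schema query: aggregate facts sliced by dimension attributes.
rows = conn.execute("""
    SELECT d.month, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_id = f.date_id
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY d.month, p.name
    ORDER BY d.month, p.name
""").fetchall()

print(rows)  # -> [(1, 'Gadget', 50.0), (1, 'Widget', 100.0), (2, 'Widget', 75.0)]
```

Keeping descriptive attributes in the dimensions and numeric measures in the fact table is what makes this shape efficient for the BI tools (Power BI, Synapse) the role targets.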
Not Specified
Electrical Engineer - Data Centers
✦ New
Salary not disclosed
San Francisco, CA 11 hours ago

Electrical Engineer - Data Centers - San Francisco


Metric DCX are partnered with a global engineering and consultancy firm to support the continued growth of their data center division.


This Electrical Engineer position will specialize in data center facility design to be embedded directly with a major end-user client.


Responsibilities:

  • Assessing third-party and colocation facilities being considered for acquisition, evaluating their suitability against the client's portfolio requirements.
  • Taking ownership of power systems across all project phases, identifying and resolving issues as they arise in collaboration with the relevant client stakeholders.
  • Reviewing data center designs with a critical eye on redundancy architecture, availability targets, and potential single points of failure.
  • Working closely with operations, planning, and energy strategy teams to push electrical solutions forward on third-party data center projects.
  • Conducting technical due diligence and maintaining quality standards in line with client expectations.
  • Keeping internal documentation, specs, and standards current based on live project feedback and lessons learned.
  • Liaising with internal teams on power loading, rack deployment, and load balancing within shared facilities.
  • Contributing to cross-discipline coordination with mechanical and controls engineers, and supporting consistency across regional teams.


Background Required

  • Degree-qualified in Electrical Engineering; a postgraduate qualification or PE license would be a strong advantage.
  • At least five years working within mission-critical environments, with solid hands-on exposure to colocation and multi-tenant data center projects specifically.
  • Confident in power systems analysis and the software tools that come with it.
  • Practical experience across the full electrical distribution stack — from high voltage transformers down to branch circuits — covering design, procurement, commissioning, and operations.
  • Comfortable working across disciplines and engaging with structural, mechanical, civil, and IT/Telecom teams as needed.
  • Grounded in US electrical codes and standards, with some awareness of IEC standards beneficial.
Not Specified