Prometheus SQL Examples Jobs in USA
1,846 positions found — Page 4
Many candidates spend months learning frameworks and finishing courses, only to freeze during technical screens, system questions, or behavioral rounds.
The result is painful: "almost hired" over and over again, while confidence drops.
The truth is that interviewing is its own skill, and most bootcamps don't teach it deeply.
They teach how to code—but not how to think out loud, structure answers, debug in real time, defend trade-offs, and communicate like an engineer.
Since 2010, SynergisticIT has helped candidates land full-time roles with many major employers.
The best way to understand this: you can be smart and still fail interviews if you don't know what the interview is truly measuring.
Interviews rarely test "can you write code at home." They test:
- Can you solve problems under constraints and time pressure?
- Can you communicate your approach clearly?
- Can you handle edge cases and complexity?
- Can you explain trade-offs and design choices?
- Can you show job-ready project depth, not just toy examples?
SynergisticIT focuses on roles such as entry-level software programmers, Java full stack developers, Python/Java developers, Data Analysts, Data Engineers, Data Scientists, and Machine Learning Engineers.
The focus areas include Java / Full Stack / DevOps and Data tracks like Data Engineering, Data Analytics/BI, and ML/AI, because those are the roles employers continue to hire for.
If your pattern is "I reach interviews but don't clear them," you likely need three upgrades:
- Stronger project narratives (what you built, why it matters, how it works)
- Stronger technical foundations (DSA, OOP, APIs, SQL, pipeline design)
- Mock interview reps (realistic simulation, feedback, improvement loops)
Many jobseekers underestimate how much hiring is about clarity.
You don't need to be perfect—you need to show you can think, collaborate, and deliver.
That's why guided mock interviews and structured interview coaching can be a game-changer.
Please read our blogs:
- Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
- What Recruiters Look for in Junior Developers | SynergisticIT
- Software engineering or Data Science as a career?
- How OPT Students Can Land Tech Jobs – SynergisticIT
Ideal candidates for this version include:
- Candidates who get interviews but repeatedly fall short
- Jobseekers stuck in "screen round limbo"
- Developers who panic during live coding
- Candidates who can build projects but struggle to explain them
- Professionals who haven't interviewed in years and feel rusty
- Career changers who fear "I'm behind CS grads" (often untrue with support)
SynergisticIT provides support for candidates navigating STEM OPT extension, H-1B filing, and Green Card processes (where applicable), which can matter when timing is critical.
Event videos (OCW, JavaOne, Gartner): USA Today feature
If you're tired of failing interviews and want a structured plan to convert interviews into offers, start here:
Contact SynergisticIT:
Because getting hired isn't about trying harder—it's about preparing smarter, practicing correctly, and having the right guidance.
Please note: Resume databases are shared with clients and interested clients will reach out directly if they find a qualified candidate for their req.
Resume submissions may be shared with our JOPP team database also.
If you are contacted and no longer wish to be, please unsubscribe; if you do not want to be contacted at all, please do not submit your resume.
Get Hired by taking action.
If you just graduated (or you're about to) and the job search is already feeling confusing, you're not imagining it.
A degree proves you can learn—but employers hire for job readiness: projects that look like real work, current tech stacks, interview confidence, and the ability to contribute on day one.
That's why many new grads send hundreds of applications and still hear nothing back.
It's not because you're "not smart enough." It's because most entry-level pipelines are crowded, and hiring teams filter heavily for candidates who look production-ready.
We are actively considering candidates for entry-level software engineering and data roles, especially Java full stack, Java/Python development, DevOps automation, data analytics, data engineering, data science, and ML/AI—full-time opportunities aligned to client needs.
Our core emphasis remains Java/Full Stack/DevOps and Data/Analytics/Engineering/ML.
SynergisticIT focuses on two high-demand lanes: Java / Full Stack / DevOps and Data (Data Analyst, Data Engineer, Data Scientist) + ML/AI—so you don't graduate with scattered skills; you graduate with an employable stack.
Since 2010, SynergisticIT has helped candidates land full-time roles at major organizations (examples often cited include Google, Apple, PayPal, Visa, Western Union, Wells Fargo, Wayfair, and more) with offers commonly in the $95k–$154k range depending on role and skill depth.
For a new grad, the bigger message isn't the number—it's that results require a structured pathway, not random applications.
Here's a realistic way to think about your advantage as a fresh graduate: you're early enough to build the right foundation before bad habits set in.
If you master fundamentals—coding, debugging, data structures, system thinking—and then layer modern tools on top (frameworks, cloud, CI/CD, analytics stacks), you become the kind of "entry-level" candidate who actually feels like a safe hire.
What roles are companies hiring for right now? The market demand pattern is clear: organizations still need entry-level software programmers, Java full stack developers, Python/Java developers, DevOps-focused engineers, and, on the data side, data analysts, BI analysts, data engineers, data scientists, and machine learning engineers.
The strongest candidates aren't "tool collectors"—they're people who can show end-to-end capability: build an API, connect a database, deploy a service, analyze data, explain results, and handle interviews calmly.
Why fresh grads get stuck: fresh grads often struggle for four predictable reasons:
- Resume doesn't match job keywords (ATS filters you out).
- Projects look like school assignments (not production-aligned).
- Interview skills are undertrained (DSA, system design, SQL, behavioral).
- No structured pipeline (random applying without feedback loops).
A job-placement-first approach addresses these systematically: build the right portfolio, practice the right interview questions, align your tech stack to roles, and keep improving until the market says "yes."
Who this path fits best: if you're a recent graduate, you'll likely fit if you match any of these:
- New grads in CS, Engineering, Math, or Statistics with limited job experience
- Students finishing Bachelor's or Master's programs who need a real hiring plan
- Candidates who apply consistently but don't get callbacks
- Candidates who reach interviews but struggle to close
- International students on F-1/OPT who need a job plan for STEM extension/H-1B timing
- Graduates with strong academics but thin practical experience
SynergisticIT helps with STEM extension and work authorization pathways and, for candidates who need long-term stability, offers support related to H-1B and green card processes as part of employer-side realities.
If you're tired of guessing, stop treating your job search like a lottery.
Treat it like a project with milestones: skills → portfolio → interview readiness → targeted applications → scheduled interviews → offer.
If you want to explore, here are the key links:
Event videos (OCW, JavaOne, Gartner): USA Today feature
Contact & get a roadmap:
Please read our blogs:
- Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
- What Recruiters Look for in Junior Developers | SynergisticIT
- Software engineering or Data Science as a career?
- How OPT Students Can Land Tech Jobs – SynergisticIT
Bottom line for fresh grads: Your degree is the starting line, not the finish line.
If you want to get hired faster, you don't need "more random courses." You need a guided, job-focused path and the right people around you.
In tech, it's not just what you learn—it's how you learn and who you build with that decides how far you go.
Position title:
Health & Safety Net Researcher
Salary range:
The UC academic salary scales set the minimum pay determined by the rank and step at appointment. See the following table(s) for the current salary scale(s) for this position. The current full-time base salary range for this position is $76,500 - $197,700.
Off-scale salaries, which yield compensation that is higher than the published system-wide salary at the designated rank and step, are offered when necessary to meet competitive conditions.
Percent time:
100
Anticipated start:
December 2025
Position duration:
Two years
Application Window
Open date: October 20, 2025
Most recent review date: Monday, Nov 17, 2025 at 11:59pm (Pacific Time)
Applications received after this date will be reviewed by the search committee if the position has not yet been filled.
Final date: Monday, Apr 20, 2026 at 11:59pm (Pacific Time)
Applications will continue to be accepted until this date, but those received after the review date will only be considered if the position has not yet been filled.
Position description
Department Overview
The California Policy Lab (CPL) generates research insights for government impact. Through hands-on partnerships with government agencies, CPL performs rigorous research across issue silos and builds the data infrastructure necessary to improve programs and policies that millions of Californians rely on every day. We work on California's most urgent issues, including homelessness, poverty, criminal justice reform, and education inequality. At its Berkeley site, CPL resides as a center within the Institute for Research on Labor and Employment (IRLE). CPL recognizes the value of having a diverse staff at all levels of the organization. When you join our team, you can expect to be part of an inclusive and equity-focused community.
Position Description
The Health & Safety Net Researcher will lead CPL's research portfolio relating to the social safety net and health, in collaboration with and under the oversight of CPL's faculty affiliates and Research Director. This is an exciting role for a skilled safety net and/or health researcher who wants to design and conduct policy-relevant quantitative research in partnership with state and local agencies throughout California. The Health & Safety Net Researcher conducts quantitative research, including conceptualizing research questions and design; requesting, receiving, and cleaning data files; creating and implementing an analysis plan; conducting quality assurance reviews; summarizing results in documents for both academic and policy audiences; and generating replicability documentation. The Health & Safety Net Researcher can implement multiple research designs and analysis techniques, including but not limited to randomized control trials, quasi-experimental designs using natural experiments or other such variation, and difference-in-difference and event study analyses. Further, a successful candidate can lead research projects under the oversight of CPL's Research Director with small teams, write proposals to support research projects, and ensure timely and high-quality completion of research tasks. The Health & Safety Net Researcher will mentor and supervise other research staff. The position will report to the Research Director and will work directly with leading social policy researchers at UC and other top universities, state and local government agency staff, and CPL's leadership team.
Position Responsibilities
- Under the oversight of the Research Director, conduct quantitative research, including conceptualizing research questions and design; requesting, receiving, and cleaning data files; creating and implementing analysis plans; conducting quality assurance reviews; summarizing results in documents for both academic and policy audiences; and generating replicability documentation.
- Annually publish multiple reports and policy briefs based on research.
- Implement multiple research designs and analysis techniques, including but not limited to randomized control trials, quasi-experimental designs using natural experiments or other such variation, and difference-in-difference and event study analyses.
- Lead research projects with small teams, write proposals to support research projects, and ensure timely and high-quality completion of research tasks.
- Supervise other research staff, provide feedback on performance.
- Partner effectively with state and local agency staff to build a joint research agenda.
Conviction History Background
This is a designated position requiring fingerprinting and a background check due to the nature of the job responsibilities. Berkeley does hire people with conviction histories and reviews information received in the context of the job responsibilities. The University reserves the right to make employment contingent upon successful completion of the background check.
Qualifications
Basic qualifications (required at time of application)
Bachelor's degree or equivalent international degree
Additional qualifications (required at time of start)
Associate Specialist
Bachelor's degree or equivalent international degree and at least five (5) years of professional experience
OR
Master's degree or equivalent international degree and at least three (3) years of professional experience
Full Specialist
Bachelor's degree or equivalent international degree and at least ten (10) years of professional experience
OR
Master's degree or equivalent international degree and at least eight (8) years of professional experience
OR
PhD or equivalent international degree and at least two (2) years of professional experience.
The California Policy Lab is unable to offer visa sponsorship for these positions.
Preferred qualifications
- Ph.D. in economics, public policy, or related social science field, or equivalent experience.
- Five or more years of post-Ph.D. experience managing projects as a Principal Investigator and supervising project staff.
- Training and experience that clearly demonstrate qualifications.
- Fluency in data-analysis packages from commonly used programming languages like Python, R, Stata, SQL, or SAS.
- Expertise in one of CPL's policy areas.
- Strong interpersonal and communication skills and ability to work both independently and as a team member.
- Strong organizational skills and attention to detail and ability to multi-task with demanding timeframes.
- Record of independent research and publication, including leading research design and analysis and writing for both academic and policy audiences.
- Experience writing grant proposals, communicating with funders, and managing grant reporting.
- Expertise in social safety net and/or health policy research.
- Experience working with large and complex administrative datasets, including data linkage techniques.
- Knowledge of data management systems, practices, and standards and ability to work discreetly with sensitive and confidential data, and experience with GitHub.
- Experience collaborating with government agency partners.
Application Requirements
Document requirements
Curriculum Vitae - Your most recently updated C.V.
Cover Letter
Research Statement - Please discuss research accomplishments and proposed plans. This can include, for example, your publication record, awards, presentations, inclusive research practices that promote the excellence of your research, and areas for future research.
(Optional) Writing Sample - One or more illustrative examples of the candidate's research - a "job market paper," a manuscript, or other research product suitable to the candidate's field. The candidate should be the primary author.
Reference requirements
- 3-5 required (contact information only)
Apply link:
JPF05157
About UC Berkeley
UC Berkeley is committed to diversity, equity, inclusion, and belonging in our public mission of research, teaching, and service, consistent with UC Regents Policy 4400 and University of California Academic Personnel policy (APM 210 1-d). These values are embedded in our Principles of Community, which reflect our passion for critical inquiry, debate, discovery and innovation, and our deep commitment to contributing to a better world. Every member of the UC Berkeley community has a role in sustaining a safe, caring and humane environment in which these values can thrive.
The University of California, Berkeley is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, age, or protected veteran status.
For more information, please refer to the University of California's Affirmative Action and Nondiscrimination in Employment Policy and the University of California's Anti-Discrimination Policy.
In searches when letters of reference are required all letters will be treated as confidential per University of California policy and California state law. Please refer potential referees, including when letters are provided via a third party (i.e., dossier service or career center), to the UC Berkeley statement of confidentiality prior to submitting their letter.
As a University employee, you will be required to comply with all applicable University policies and/or collective bargaining agreements, as may be amended from time to time. Federal, state, or local government directives may impose additional requirements.
Unless stated otherwise, unambiguously, in the position description, this position does not include sponsorship of a new consular H-1B visa petition that would require payment of the $100,000 supplemental fee.
As a condition of employment, the finalist will be required to disclose if they are subject to any final administrative or judicial decisions within the last seven years determining that they committed any misconduct.
- "Misconduct" means any violation of the policies or laws governing conduct at the applicant's previous place of employment, including, but not limited to, violations of policies or laws prohibiting sexual harassment, sexual assault, or other forms of harassment or discrimination, as defined by the employer.
- UC Sexual Violence and Sexual Harassment Policy
- UC Anti-Discrimination Policy
- APM - 035: Affirmative Action and Nondiscrimination in Employment
Job location
Berkeley, CA with an expectation of 3 days a week in the office
OFFICIAL JOB TITLE: Institutional Research Associate
DIVISION: Academic Affairs
DEPARTMENT: Institutional Research and Assessment
BARGAINING UNIT STATUS: ESU, Cat 12
REPORTS TO: Director of Institutional Research and Assessment
SUPERVISES: N/A
SUMMARY PURPOSE OF POSITION: Supports institutional research functions in data extraction, interpretation, and presentation and helps implement a variety of data inquiries and information collection activities; extracts data from complex files and systems including the campus enterprise system (across its functional areas including students, human resources, finances, and other system areas) and Institutional Research and UMass databases; translates extracted data into other data structures for specific reports (such as the HEIRS data system for reporting unit data to the Massachusetts Department of Higher Education); applies expertise in data interpretation ensuring consistent and auditable results and interacts with campus staff and representatives of external organizations to validate existing data, obtain new data, and support information and data needs; reports data extraction and inquiry results in a variety of complex formats, including in-house publications and reports, ad-hoc reports or inquiries, and external surveys or other external data reports that can be at the federal or state levels.
EXAMPLES OF PRIMARY DUTIES AND RESPONSIBILITIES:
- Extracts data from complex files and systems including the campus enterprise system and creates and maintains SPSS databases and dictionaries.
- Translates extracted data into other data structures for specific reports, such as those for the comprehensive HEIRS data system for reporting unit data to the Massachusetts Department of Higher Education.
- Applies expertise in data interpretation using the enterprise information system (PeopleSoft) and IR/UMass databases, ensuring consistent and auditable results across the functional areas of the enterprise system (including students, human resources, finances, and other system areas). Interpretation includes accurate application of data definitions/look-up tables and the application of subtotals, percentages, averages, percentiles, and similar functions.
- Interacts with campus staff and representatives of external organizations to validate existing data and obtain new data and supports information and data needs for other campus units as assigned.
- Reports data extraction and inquiry results in a variety of complex formats, including in-house publications and reports, ad-hoc reports or inquiries, and external surveys, especially for college handbooks and the Common Data Set, and for other external data reports that can be at the federal or state levels.
- Performs other related duties as assigned.
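The interpretation work above includes applying subtotals, percentages, averages, and percentiles to extracted data. A minimal sketch of those reporting calculations in Python's standard library follows; the colleges and headcounts are invented for illustration, not real extract data.

```python
from statistics import mean, quantiles

# Hypothetical extract: (college, headcount) rows pulled from an enterprise system.
rows = [
    ("Arts & Sciences", 412),
    ("Arts & Sciences", 388),
    ("Engineering", 295),
    ("Engineering", 310),
    ("Business", 267),
]

# Subtotals by college, as a report table would group them.
subtotals = {}
for college, headcount in rows:
    subtotals[college] = subtotals.get(college, 0) + headcount

total = sum(subtotals.values())

# Percentages of the overall total, rounded for presentation.
percentages = {c: round(100 * n / total, 1) for c, n in subtotals.items()}

# Average and quartiles across all extracted rows.
avg = mean(h for _, h in rows)
q1, median, q3 = quantiles([h for _, h in rows], n=4)

print(subtotals)    # {'Arts & Sciences': 800, 'Engineering': 605, 'Business': 267}
print(percentages)
print(avg, median)
```

The same shape of computation applies whether the source is PeopleSoft, an IR database, or a flat extract: group, subtotal, then derive percentages and distribution statistics for the report.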
MINIMUM QUALIFICATIONS:
- EDUCATION: Bachelor's Degree in a related field
- EXPERIENCE: Demonstrated (over 3 years) experience in the general field of programming, data extraction, and reporting.
- OTHER: May need to be available for evening/weekend hours as needed; some travel to off-campus locations may occur
PREFERRED QUALIFICATIONS:
- Experience in a college/university or other institutional setting performing management reporting and analysis
- Experience in a higher education setting
- Experience with enterprise system reporting and analysis
- Experience in using the PeopleSoft enterprise information system and the OBIEE inquiry tool
- Experience working in a data warehouse environment
KNOWLEDGE, SKILLS AND ABILITIES REQUIRED:
- Ability to learn data values and interrelationships in a comprehensive enterprise information system and apply them in a range of specific contexts and reports.
- Demonstrated database and reporting skills including SQL queries, OBIEE or related reporting tool, and SPSS and/or Excel to perform analyses and general reports.
- Ability to provide professional-level accuracy and precision of interpretation of complex sets of data.
- Ability to manage multiple competing priorities and projects
- Knowledge of basic statistical applications.
SALARY: $52,756.00 to $65,945.20
UMass Dartmouth offers exciting benefits such as:
- 75% Employer-Paid Health Insurance
- Flexible Spending Accounts
- Life Insurance
- Long Term Disability
- State Pension Retirement Plan
- Optional Retirement Savings Plans
- Tuition Credit (Employee, Spouse, & Dependents)
- Twelve (12) paid holidays
- Paid personal, vacation, and sick time
- And More!
Benefits for ESU Union: ESU
Applicants must be authorized for employment in the U.S. on a full-time basis. Employment-based visa sponsorship is not available.
To apply please submit a letter of interest, a current resume and the contact information for three professional references.
The posting deadline for early consideration for internal ESU applicants is January 30, 2026.
The review of applications will be ongoing until the position is filled.
IT Analyst Mid Level – Epic MyChart / Digital Consumer Experience
Hybrid or remote with occasional travel in | Healthcare Technology | Contract-to-Hire
We are seeking an Epic MyChart Certified IT Analyst Senior to support a growing Digital Consumer Experience team focused on Epic MyChart and patient-facing technologies. This role is ideal for someone who combines strong healthcare application support experience with Epic MyChart expertise, integrations, and digital patient engagement tools.
You’ll work in a collaborative Agile environment supporting and enhancing applications that directly impact the patient experience, including MyChart, telehealth workflows, patient messaging, and digital care pathways.
This position plays a key role in analyzing requirements, designing solutions, supporting integrations, and improving digital healthcare workflows across multiple Epic consumer-facing applications.
Key Experience We’re Looking For
Candidates with experience in Epic MyChart and digital patient engagement platforms will stand out, particularly in the following areas:
Epic MyChart & Digital Consumer Applications
- Epic MyChart and MyChart Mobile
- MyChart Care Companion configuration and workflow management
- Epic Hello World
- Patient messaging workflows and monitoring
Integrations & Digital Health Connectivity
- SMART on FHIR app integrations
- Care Everywhere awareness
- MyChart Central and Share Everywhere
- Third-party integrations (telehealth, billing, CRM platforms)
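For context on the SMART on FHIR item above: such apps consume standard FHIR resources after launch. The sketch below parses a minimal, hand-written FHIR R4 Patient resource with Python's standard library; the values are illustrative and do not come from any real Epic environment or live server.

```python
import json

# A minimal FHIR R4 Patient resource of the kind a SMART on FHIR app
# receives from GET [base]/Patient/{id}; all values are made up.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"use": "official", "family": "Rivera", "given": ["Ana", "M."]}],
  "birthDate": "1984-07-02",
  "telecom": [{"system": "email", "value": "ana@example.org"}]
}
"""

patient = json.loads(patient_json)

# Assemble a display name the way a patient-facing widget might.
official = next(n for n in patient["name"] if n.get("use") == "official")
display_name = " ".join(official["given"]) + " " + official["family"]

print(patient["resourceType"], patient["id"])
print(display_name)   # Ana M. Rivera
```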
Telehealth & Video Visits
- Video visit workflow configuration
- Troubleshooting connectivity issues
- Device readiness (camera/microphone validation)
- Video visit scheduling and configuration
Monitoring & Reporting
- Monitoring patient message volume and workflow performance
- Root cause analysis of system failures
- Adjusting build/configuration to improve user experience
- Collaboration with marketing, access, and digital teams
MyChart Care Companion
- Building and maintaining care pathways
- Configuring tasks, questionnaires, and educational content
- Managing reminders, notifications, and escalations
- Outcome tracking and patient engagement analytics
- Workflow testing, validation, and ongoing maintenance
Digital Experience Platforms
- Physician intranet widgets and digital content configuration
- MyChart intranet updates, knowledge resources, and training materials
- Collaboration with internal teams to support digital engagement strategies
Role Responsibilities
Working within Agile and other IT frameworks, the IT Analyst Senior will:
- Partner with stakeholders to gather, analyze, and document business and technical requirements
- Support and enhance Epic and healthcare applications
- Troubleshoot and resolve application issues using strong analytical and root cause analysis skills
- Lead application upgrades and project initiatives
- Design and implement solutions across the software development lifecycle
- Maintain vendor-supported application versions
- Collaborate with vendors on complex escalations
- Maintain application infrastructure health including patching and system maintenance
- Provide documentation, training, and knowledge sharing across teams
- Participate in on-call rotations for application support
- Mentor junior team members and facilitate knowledge sharing
Required Qualifications
Education
- Associate’s degree or equivalent experience required
- Bachelor’s degree preferred
Experience
- 5+ years of IT or healthcare application support experience
- Experience supporting Epic or healthcare technology platforms strongly preferred
Certifications (Preferred)
Candidates may be asked to obtain certifications within one year of hire.
Examples include:
- Epic Certification (MyChart)
- ITIL Certification
- CompTIA A+
- SQL Certification
- Certified Scrum Developer (CSD)
- OnBase Certification
- RHIT / RHIA
- CAHIMS
- 3M 360 Systems Administrator
Core Competencies
Successful candidates will demonstrate:
- Strong communication and stakeholder collaboration
- Analytical thinking and problem solving
- Adaptability in fast-paced Agile environments
- Ability to translate technical and business requirements into practical solutions
- A collaborative mindset focused on continuous improvement
If you have experience with Epic MyChart, patient engagement tools, and healthcare application integrations, this is an opportunity to play a meaningful role in improving the digital healthcare experience for patients and providers.
The intention is to fill this position to cover the gap between when the incumbent DBA leaves and when the position is backfilled, which will take several months.
We cannot afford to be without a resource working on the large and complex applications that we support, including several Tier 1 applications in the Bureau of Labs, Cancer Registry, etc.
As an intermediate-level Database Administrator, this resource will participate in 24x7 software and hardware support for complex applications in several versions of SQL Server, with high availability and Disaster Recovery support, following industry and DTMB development standards.
They will:
- Help develop and/or submit for approval plans for installation, patch management, maintenance, upgrades, and support for database systems
- Evaluate impacts of change and new technology, recommend solutions to persistent problems, and serve as an Agency Services liaison to external consultants
- Follow and enforce database standards, policies, and procedures
- Research and draft guidelines within the boundaries of current policies and standards
- Monitor space allocation across databases, and perform adjustments in test and development environments, as necessary and as prescribed by predefined standards/guidelines
- Calculate disk space requirements for existing and/or new installations based on existing business needs
- Modify DBMS parameters based on capacity changes
- Configure and execute database integrity checks
- Monitor database integrity checks
- Install database management software for development and test environments, and patches and service packs for development and test environments
- Monitor and support clustered database environments
- Monitor and support database replication and backup environments
- Implement strategy to release unused space or repair fragmentation in test and development environments
- Execute (run) scripts provided by Systems Analysts or Database Architects/Designers for creating and modifying database objects (tables, views, constraints, indexes, etc.)
- Monitor database back-ups to ensure recoverability; troubleshoot backup errors
- Monitor database jobs and scheduled processes in development, test, and production environments; participate in troubleshooting
- Monitor database environments (using alert logs, trace files, alert mechanisms, and other tools) for issues and problems with database functionality, connectivity, or downtime
- Follow standards and guidelines for database space allocation based on best practices and implementation considerations based on business requirements
Job Qualifications:
- 4+ years of database administration experience, specifically on SQL Server 16 and upwards
- 2+ years of experience creating, updating, and maintaining systems documentation
- Expertise in HA and DR solutions
- Experience with Transparent Data Encryption within SQL Server
- A minimum of a Bachelor's Degree in Computer Science, Information Systems, or other relevant field required.
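Integrity checks and space monitoring of the kind described above can be scripted. The sketch below uses SQLite as a self-contained stand-in; on SQL Server the equivalents would be DBCC CHECKDB for integrity and catalog views such as sys.dm_db_partition_stats for space, and the table and data here are invented.

```python
import sqlite3

# Stand-in database with a small sample table (names are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lab_results (id INTEGER PRIMARY KEY, specimen TEXT, value REAL)")
conn.executemany(
    "INSERT INTO lab_results (specimen, value) VALUES (?, ?)",
    [("blood", 5.4), ("tissue", 2.1), ("blood", 6.7)],
)

# Integrity check: a single row containing 'ok' means no corruption found.
(integrity,) = conn.execute("PRAGMA integrity_check").fetchone()

# Space allocation: pages in use times page size gives the storage footprint,
# the figure a DBA would trend over time against predefined thresholds.
(page_count,) = conn.execute("PRAGMA page_count").fetchone()
(page_size,) = conn.execute("PRAGMA page_size").fetchone()
bytes_used = page_count * page_size

print(integrity)       # ok
print(bytes_used > 0)  # True
```

In production the same loop (run check, capture result, compare against threshold, alert) would be scheduled as an agent job rather than run ad hoc.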
Remote working/work at home options are available for this role.
For over 95 years, we have cared for our employees and customers, which is why we rank as the 19th largest privately-owned builder in the country and have a history of long-tenured employees.
We are proud to be named a U.S. Best Managed Company in 2022, 2023, 2024, and 2025, a program sponsored by Deloitte Private and The Wall Street Journal, and to be officially certified as a Great Place to Work for the last three years.
Enrich your career at a company that values integrity, excellence, opportunity, stability, and success.
Headquartered in Fort Mitchell, Kentucky, Drees operates in twelve metropolitan areas: Greater Cincinnati (including Northern Kentucky), Cleveland, and Columbus, Ohio; Austin, Dallas, Houston, and San Antonio, Texas; Indianapolis, Indiana; Jacksonville, Florida; Nashville, Tennessee; Raleigh, North Carolina; and Washington, D.C.
Responsibilities
Drees Homes is seeking a detail-oriented and analytical Business Analyst - Data & Analytics to become a valued member of our team.
This role is ideal for someone passionate about transforming data into actionable insights that drive strategic decisions.
The ideal candidate will have strong technical skills in SQL, Power BI, and ETL processes, along with the ability to communicate findings clearly and collaborate across departments.
You will play a key role in shaping data-driven strategies by developing reports, dashboards, and documentation that support business goals.
Key Responsibilities:
Data Analysis: Analyze complex datasets to uncover trends, patterns, and actionable insights.
SQL Development: Write and optimize SQL queries to extract, transform, and manipulate data from various databases.
Power BI Reporting: Design, develop, and maintain interactive dashboards and visual reports using Power BI.
Report Requirements Gathering: Collaborate with stakeholders to gather and document detailed reporting requirements.
ETL Processes: Design, implement, and manage ETL workflows to ensure data accuracy, consistency, and availability.
API Connections: Integrate and manage data from external systems using API connections to enhance data accessibility and automation.
Documentation: Create and maintain comprehensive documentation for data processes, methodologies, and analytical findings.
Cross-functional Collaboration: Work closely with business units to understand data needs and provide analytical support.
Ad-hoc and Scheduled Reporting: Generate regular and on-demand reports to support business decision-making.
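The SQL Development, ETL, and reporting responsibilities above can be sketched end-to-end in miniature. This is a hypothetical illustration using Python's built-in sqlite3 module; the table, regions, and figures are invented, and a real Drees pipeline would pull from Oracle/Azure sources and surface results in Power BI:

```python
# Hedged sketch of an extract/transform/report loop like the one described
# above. All names and numbers are illustrative, not from the posting.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('Cincinnati', 450000), ('Austin', 610000), ('Cincinnati', 380000);
""")

# Extract + transform in SQL: aggregate sales by region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()

# "Load": here, just a formatted ad-hoc report.
for region, total in rows:
    print(f"{region}: ${total:,.0f}")
```

The same GROUP BY/aggregate pattern is what typically backs a Power BI dashboard tile or a scheduled report extract.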
Required Skills:
Bachelor's degree in Data Science, Computer Science, Statistics, or a related field.
Proven experience as a Data Analyst or in a similar analytical role.
Proficiency in SQL and Power BI.
Strong understanding of ETL processes and data warehousing concepts.
Knowledgeable in database and cloud platforms including Oracle, AWS, and Azure.
Excellent analytical, problem-solving, and documentation skills.
Strong communication skills and the ability to work collaboratively in a team environment.
MUST be eligible to work in the US without sponsorship.
Premier Benefits to Support YOU - We offer a comprehensive benefits package, including:
Medical, dental, and vision coverage
Life, AD&D, and critical illness insurance
Wellness rewards
401(k) savings plan
Profit sharing
Paid time off increasing with tenure
Tuition reimbursement
Long- and short-term disability and parental leave
Employee discount program on the purchase of a Drees home
Employee Assistance Program and much more!
Join a special team that works together to make Drees a successful company and a rewarding place to work!
Qualifications
Equal Opportunity Employer / Drug-Free Workplace
To learn more about Drees Homes, please visit our website.
Certifications: Relevant Pega certifications are required (e.g., Certified Pega Business Architect, Certified Pega System Architect).
Technical Skills: Python: Strong proficiency in Python for scripting and automation tasks, with experience in integrating Python solutions within Pega applications.
SQL: Solid experience with SQL for database management and querying, including the ability to write complex queries and optimize database performance.
Apache Airflow (Optional): Experience with Apache Airflow for orchestrating complex workflows is a plus but not mandatory.
Responsibilities Develop and implement solutions using Pega CDH to enhance customer engagement strategies.
Collaborate with cross-functional teams to design and optimize workflows and decisioning processes.
Utilize Python and SQL to support data-driven decision-making and application enhancements.
Optionally, leverage Apache Airflow for efficient workflow automation and scheduling.
Additional Skills: Strong problem-solving abilities and attention to detail.
Excellent communication skills for effective collaboration with team members and stakeholders.
Ability to thrive in a fast-paced, dynamic environment and adapt to evolving project requirements.
Python, SQL, Pega, CDH
Compensation: $130-155k
Responsibilities:
- Participate in the full software development lifecycle (SDLC) and implement DevOps procedures to manage and support the CI/CD process, including automation of the build, test, and deploy pipelines and configuration management.
- Employ best practices for designing automation processes and utilities that can be easily used by the development teams.
- Design and develop a best practice release management process that employs separation of control and proper approvals.
- Closely partner with the security and infrastructure teams to incorporate corporate standards into the CI/CD and provisioning processes.
- Maintain the source control management system and integrate it with software build and deployment.
- Take responsibility for the build environment: resolve build issues and help coordinate complex software test environments and software releases.
- Monitor application operational processes, escalating and facilitating failure resolution as appropriate.
Qualifications:
Required
- 5+ years of professional experience of working with the full software development life cycle and designing/developing best practice CI/CD pipelines, GitHub Actions, Ansible (IaC), Terraform/CloudFormation, K8s, test automation, static code analysis, Artifactory and release management processes.
- Proficient in at least two of the following: Windows batch/PowerShell, bash, Python.
- Knowledgeable about networking (TCP, UDP, ICMP, ARP, DNS, TLS, HTTP, SSH, NAT, firewall, load balancing, etc.).
- Strong experience managing and supporting Windows/Linux servers.
- Good understanding of deployment of various platforms such as web/REST API, messaging bus/queue, application services, Microservices and Cloud Serverless components/managed platform.
- Experience working with relational/SQL and NoSQL databases; other database technologies are a plus.
- A curiosity concerning technology and the ability to learn new systems and tools quickly.
- Excellent communication skills and the ability to work in a collaborative environment.
Preferred
- Experience with cloud solutions, e.g., Azure (VNet, Private Link, Blob Storage, Azure SQL, Web App, Data Factory, Client, AKS, ARO, SQL Server/Cosmos) / AWS (VPC, EC2, S3, Route53, ECS, EKS, RDS, ALB/NLB).
- Experience with code-quality (SonarQube, GitHub Enterprise Advanced Security/CodeQL, Jfrog Artifactory + Xray).
- Experience with containers and orchestration technologies (Docker, K8s, OpenShift).
- Experience with application telemetry, monitoring and alerting solutions (Splunk, LogicMonitor, AWS CloudWatch, Azure Insight or similar).
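The stage-gated CI/CD flow described in the responsibilities above (build, test, deploy with proper approvals and failure escalation) can be sketched as a toy pipeline runner. This is an illustrative Python sketch, not any specific tool's API; the stage names and the `run_pipeline` helper are invented, and a real pipeline would live in GitHub Actions, Ansible, or similar:

```python
# Hedged sketch: a minimal build -> test -> deploy runner illustrating the
# stage gating a real CI/CD system provides. Stages are invented examples.
from typing import Callable

def run_pipeline(stages: list[tuple[str, Callable[[], bool]]]) -> list[str]:
    """Run stages in order; stop at the first failure. Return stages that ran."""
    executed = []
    for name, step in stages:
        executed.append(name)
        if not step():
            print(f"pipeline failed at stage: {name}")
            break
    return executed

ran = run_pipeline([
    ("build", lambda: True),
    ("test", lambda: False),   # simulated test failure gates the deploy
    ("deploy", lambda: True),
])
print(ran)  # deploy is skipped because test failed
```

The key design point mirrored here is separation of control: a deploy stage never runs unless every gate before it has passed.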
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Application Development: Experience building GenAI applications is a strong plus.
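The Medallion Architecture named in the overview layers data as bronze (raw ingested records), silver (cleaned and typed), and gold (business-level aggregates). A minimal pure-Python sketch of that layering, with invented records; a Databricks implementation would use PySpark DataFrames and Delta tables rather than plain dicts:

```python
# Hedged sketch of Medallion (bronze/silver/gold) layering in plain Python.
# All records are invented; real pipelines would use PySpark + Delta Lake.

# Bronze: raw ingested records, kept as-is (including a malformed row).
bronze = [
    {"id": "1", "amount": "100.0"},
    {"id": "2", "amount": "250.5"},
    {"id": "3", "amount": "bad"},  # malformed value, typical of raw feeds
]

# Silver: cleaned and typed; invalid rows are filtered out.
silver = []
for row in bronze:
    try:
        silver.append({"id": int(row["id"]), "amount": float(row["amount"])})
    except ValueError:
        pass  # in a real pipeline, quarantined for review rather than dropped

# Gold: business-level aggregate ready for BI, reporting, or AI workloads.
gold = {"total_amount": sum(r["amount"] for r in silver),
        "row_count": len(silver)}
print(gold)
```

Each layer is materialized separately so that reprocessing (say, a fixed parsing rule in silver) never requires re-ingesting the raw bronze data.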
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Ability to tackle complex data challenges while ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Azure Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Azure Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.