Array Declaration in Data Structure Jobs: Hiring Now in the USA
55,130 positions found
Minimum of five years experience working in analytics with hospitals and health plans.
Advanced proficiency required with VBA, SQL, Salesforce, Excel and Access.
High-level skills using web applications and all browsers; ability to teach others how to use web-based database functions.
Demonstrated experience using Microsoft Office computer applications, including Word, Access, Outlook and SharePoint.
Advanced knowledge of Excel required.
Detail-oriented with strong follow-through and ability to work independently given standard guidelines and checklists.
Good writing and communication skills.
Able to draft grammatically correct and professional email messages.
Demonstrated experience in working successfully with minimal supervision.
Must have knowledge of medical and health care terminology.
Ability to complete HIPAA training and implement high-level protections on patient information and confidentiality.
Must work effectively independently and in a team setting.
Ability to relate well with internal and external customers.
Quality/Metrics: Gather and perform analysis on data from Salesforce, Loopback, Excel, and other databases as required.
Perform data cleaning as needed to ensure data are consistent and analyzable.
Create data reports, charts, graphs and tables for regular reporting to program leads and external partners.
Export data from software systems and program tracking logs for agency reporting.
Assemble reports, papers and presentation materials as directed.
Collect data through phone and in-person interviews.
Record or transcribe data in accordance with project and funding source guidelines.
Perform literature reviews (locating, listing &/or abstracting articles).
Enter literature references into a shared database (such as EndNote).
Responsibilities: Data cleaning, formatting, and maintenance as needed.
Data visualization and analysis of program metrics.
Data Entry for the program(s) assigned.
Program reporting/billing/invoicing support.
Administrative duties as needed (mailing and other assigned work).
Establish and maintain systems for program accountability – reports track performance.
Attend and ensure follow-up after all meetings and presentations – minutes, reports, action plans, assignments, etc.
Monitors performance and responsibilities of field staff with respect to database management, metrics, and documentation.
Reports all errors in systems and workflows to both internal and external stakeholders.
Completes reporting (both internal and contractual requirements) with thorough knowledge and understanding of what is being reported.
Develops and maintains a current understanding of the Department’s Contractual Agreements.
Must have professional verbal and written communication skills and computer/software skills.
Assists with both internal and external customer service calls, emails, and requests.
Other miscellaneous tasks as assigned.
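The data cleaning described above (making records consistent and analyzable) usually comes down to normalizing formats and removing duplicates. A minimal Python sketch; the `name` and `phone` fields are hypothetical stand-ins for whatever the program's records actually contain:

```python
def clean_records(records):
    """Normalize whitespace/case and drop exact duplicate rows.

    `records` is a list of dicts with hypothetical keys such as
    'name' and 'phone'; adapt to the real schema.
    """
    seen = set()
    cleaned = []
    for rec in records:
        # Collapse internal whitespace and trim each field.
        norm = {
            k: " ".join(str(v).split()) if v is not None else ""
            for k, v in rec.items()
        }
        norm["name"] = norm.get("name", "").title()
        key = tuple(sorted(norm.items()))
        if key not in seen:          # drop exact duplicates
            seen.add(key)
            cleaned.append(norm)
    return cleaned

rows = [
    {"name": "  jane  doe ", "phone": "555-0100"},
    {"name": "Jane Doe", "phone": "555-0100"},
]
print(clean_records(rows))
```

After normalization the two rows collapse to a single record, which is the kind of consistency check a report pipeline depends on.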
SQL Server database design, implementation, and troubleshooting.
Develop, optimize, and maintain complex T-SQL queries, stored procedures, indexes, and constraints; resolve performance issues, deadlocks, and contention using traces, execution plans, and profiling.
Design, develop, test, and implement ETL/ELT processes using Talend for data extraction, transformation, and loading from diverse sources, including Salesforce CRM data.
Administer and optimize Talend environment, including job scheduling, dependencies, monitoring, automation, patches, upgrades, and performance tuning.
Integrate Salesforce data (e.g., via APIs, connectors) into SQL Server databases and data warehouses, ensuring data quality, synchronization, and real-time/batch processing.
Collaborate face-to-face with business stakeholders to analyze requirements, gather specifications, evaluate data sources/targets, and design solutions that improve business performance.
Lead ETL development activities, ensure code quality, provide feedback on performance.
Support enterprise data warehouse, data marts, and business intelligence initiatives; perform source data analysis and dimensional modeling.
Develop and automate processes using scripting.
Provide tier 2/3 support, evaluate production issues, recommend improvements, and participate in project planning following Agile methodologies.
Perform proactive performance optimization and data synchronization across environments.
Mentor staff, recommend process enhancements, and contribute specialized knowledge across IT and business operations.
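The query-performance work described above (comparing execution plans before and after indexing) can be illustrated in miniature. This sketch uses Python's sqlite3 and an invented `claims` table rather than SQL Server, so the tooling differs from real T-SQL work (traces, actual execution plans), but the before/after plan comparison is the same idea:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, member_id TEXT, amount REAL)")
conn.executemany("INSERT INTO claims (member_id, amount) VALUES (?, ?)",
                 [(f"M{i % 100:03d}", i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail).
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT SUM(amount) FROM claims WHERE member_id = 'M042'"
before = plan(query)          # full table scan without an index
conn.execute("CREATE INDEX ix_claims_member ON claims(member_id)")
after = plan(query)           # now an index search
print(before, after)
```

The plan detail changes from a table scan to a search using `ix_claims_member`, which is the signal a tuning pass looks for.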
Document data integration processes, workflows, ETL designs, data mappings, technical specifications, and system configurations.
Manage version control and deployments.
Collaborate on testing (unit, integration, UAT).
Translated business requirements into actionable data specifications, documentation, and code solutions using Salesforce Object Manager and official documentation.
Reviewed Salesforce release notes, verified production deployments, and conducted feature testing across sandbox and production environments with detailed feedback submission.
Developed and maintained complex SOQL queries to support data team operations, reporting, and analytics needs.
Designed and built custom Salesforce reports to support data operations and Enhanced Care Management (ECM) programs.
Developed and deployed end-to-end solutions for processing health plan MIF data, enabling efficient insert, update, and reporting workflows for Lead and Case objects.
Performed large-scale data inserts, updates, and migrations using Salesforce Data Loader in both sandbox and production environments.
Extracted, analyzed, and transformed backend Salesforce data using Talend and SQL to produce accurate reports for compliance, billing, and operational needs.
Identified and resolved reporting discrepancies and data quality issues through root-cause analysis and targeted corrections.
Cleaned, standardized, and transformed referral data for mass uploads into Salesforce while enforcing validation rules and workflow requirements.
Created Salesforce-based error reports that enabled program teams to quickly identify and correct data entry issues.
Conducted data gap analyses against vendor reporting requirements and designed field transformations and new data structures to meet compliance and reporting standards.
Integrated offshore datasets with Salesforce records to address missing or incomplete data, improving accuracy for reporting and billing.
Reduced manual data entry and correction efforts by automating large-scale updates, inserts, and fixes via Salesforce Data Loader.
Maintained vendor zip code records in Salesforce to ensure accurate service area tracking, correct billing rates, and reliable historical reference.
Partners in Care Foundation is an equal opportunity employer.
We are committed to complying with all federal, state, and local laws providing equal employment opportunities, and all other employment laws and regulations.
It is our intent to maintain a work environment which is free of harassment, discrimination, or retaliation because of age, race (including hair texture and protective hairstyles, such as braids, locks, and twists), color, national origin, ancestry, religion, sex, sexual orientation, pregnancy (including childbirth, lactation/breastfeeding, and related medical conditions), physical or mental disability, genetic information (including testing and characteristics, as well as those of family members), veteran status, uniformed service member status, gender, gender identity, gender expression, transgender status, arrest or conviction record, domestic violence victim status, credit history, unemployment status, caregiver status, sexual and reproductive health decisions, salary history or any other status protected by federal, state, or local laws.
All qualified applicants will receive consideration for employment and reasonable accommodations may be made to enable qualified individuals to perform the essential functions of the position.
Remote working/work at home options are available for this role.
Doctor of Medicine | Psychiatry - General/Other
Location: Indianapolis, IN
Employer:
Pay: Competitive weekly pay (inquire for details)
Start Date: ASAP
About the Position
LocumJobsOnline is working with to find a qualified Psychiatry MD in Indianapolis, Indiana, 46201!
This Job at a Glance
- Job Reference Id: ORD-210224-MD-IN
- Title: MD
- Dates Needed: April 3rd - July 3rd
- Shift Type: Day Shift
- Assignment Type: Inpatient
- Call Required: No
- Board Certification Required: No
- Job Duration: Locums
This inpatient psychiatric hospital specializes in advanced diagnosis and stabilization of complex mental health cases. The facility provides comprehensive psychiatric services with dedicated resources for intensive treatment of patients requiring specialized inpatient care. The hospital maintains modern documentation systems and interdisciplinary treatment teams to ensure effective delivery of psychiatric services in a structured inpatient environment.
About the Facility Location
Indianapolis features notable attractions including the Indianapolis Motor Speedway Museum and the Children's Museum, along with various specialty museums and walking tour opportunities. The region offers diverse entertainment venues such as the Ruoff Music Center, Everwise Amphitheatre, Clowes Memorial Hall, and Old National Centre, providing options for arts, live music, sports, and shopping experiences. Downtown presents dining and beverage establishments alongside cultural and recreational activities suitable for various interests.
About the Clinician's Workday
The psychiatrist will direct psychiatric services in accordance with institutional policies, utilizing advanced clinical training and professional judgment to guide diagnostic evaluations, treatment plans, and patient care decisions. This position serves as the final authority on psychiatric evaluations while overseeing program planning and reviewing clinical recommendations. The clinician will work part-time for 24 hours per week during day shifts, conducting weekly rounds on a 24-bed inpatient unit and managing complex mental health cases requiring advanced diagnosis and stabilization. Responsibilities include contributing to departmental strategy and agency-wide policy through research and program evaluation while maintaining comprehensive documentation of patient progress and treatment interventions.
Additional Job Details
- Case Load/PPD: 24 beds / rounding 1x a week
- Support Staff: Nursing staff, medical assistants, and administrative support
- Patient Population: Adults
- Location Type: On-Site
- Government: No
- Shift Hours: Part time (24 hours)
- Cases Treated: Complex mental health cases requiring advanced diagnosis and stabilization
- Average Length of Stay: Variable based on patient complexity and treatment needs
- Census: 24 beds / rounding 1x a week
- Med Checks/Follow-up per day: Variable b
Contact:
About
The need has never been greater to connect great clinicians and great healthcare facilities. That’s what we do. Every day. We’re . We connect clients and clinicians to take care of patients. How do we do it? By doing it better than everyone else. Whether you’re looking for a locum tenens job or locum tenens coverage, our experienced agents have the specialized knowledge, know-how, and personal relationships to take care of you and your search.
provides comprehensive onboarding and optional 1099 financial consulting from a partner advisor.
We cover your malpractice insurance (A++) and provide assistance with credentialing, privileging, licensing, housing and travel.
Our agents have the specialized knowledge and personal connections to provide the best locum tenens experience and negotiate top pay on your behalf.
DEPLOY has been retained to find a Reporting & Data Architect Lead, a role that combines advanced reporting development with enterprise-level data governance and architectural leadership. In this role, you will own our client's enterprise reporting platform: designing robust Power BI solutions, managing shared data models, and ensuring the reporting environment remains secure, scalable, and high-performing.
You will also own our client's enterprise reporting standards and governance framework, ensuring reporting across all departments is consistent, trusted, and aligned with best practices. This includes defining reporting conventions, reviewing changes, onboarding departmental report creators, and stewarding enterprise reporting assets such as certified datasets and endorsed reports.
At the enterprise level, you will architect our client's data framework—defining how data is structured, named, documented, and shared across ERP, operational, manufacturing, and corporate systems. You will own the enterprise data dictionary, the centralized semantic model, and key architectural decisions around Microsoft Fabric and other data tooling. This role interacts frequently with executives to align data strategy with organizational growth and reporting needs.
Key Responsibilities
Enterprise Reporting (Hands-On Development)
- Build, optimize, and maintain enterprise-grade Power BI reports, dashboards, datasets, and data models.
- Develop and govern shared semantic models and reusable datasets that power enterprise-wide reporting.
- Use Microsoft Fabric, Dataverse, and related ETL/data management tools to shape and integrate reporting data sources.
- Manage dataset refresh schedules, performance tuning, workspace organization, gateway configuration, and reporting system reliability.
- Implement row-level security (RLS), workspace access patterns, and enterprise reporting permissions (Responsible, with the Director of Technology Accountable).
- Manage reporting governance artifacts including certified datasets, endorsed reports, and enterprise workspace standards.
- Support reporting scalability as our client grows (new factories, new business units, new product lines).
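Row-level security in Power BI is defined as DAX filter expressions attached to roles. The sketch below mimics that concept in plain Python with hypothetical role and plant names, purely to show the role-to-filter mapping; it is not the actual Power BI mechanism:

```python
# Row-level security as a role -> filter-predicate mapping,
# mirroring how Power BI RLS attaches a DAX filter to a role.
# Role and plant names here are invented for illustration.
ROLE_FILTERS = {
    "plant_manager": lambda row: row["plant"] == "Plant A",  # sees one plant
    "executive":     lambda row: True,                       # sees everything
}

def visible_rows(rows, role):
    predicate = ROLE_FILTERS.get(role, lambda row: False)    # deny by default
    return [r for r in rows if predicate(r)]

data = [{"plant": "Plant A", "units": 10}, {"plant": "Plant B", "units": 7}]
print(visible_rows(data, "plant_manager"))
```

The deny-by-default fallback matters: an unrecognized role should see nothing, which matches how unassigned users are treated in a locked-down workspace.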
Enterprise Reporting Standards & Governance
- Own our client's enterprise reporting standards framework, covering naming conventions, modeling patterns, documentation practices, lifecycle management, visual design standards, and change control.
- Govern reporting development and deployment across the organization to ensure consistency and prevent duplicate or conflicting models.
- Review and approve reporting change requests, data model modifications, and access requests.
- Lead documentation and enablement for departmental report creators through training, guidance, and structured onboarding.
- Provide strategic direction around reporting maturity, sustainability, and enterprise alignment.
Enterprise Data Architecture
- Design and maintain our client's enterprise data architecture framework across ERP, operational, manufacturing, and corporate systems.
- Own the enterprise data dictionary, defining canonical field names, table structures, business definitions, and version control practices.
- Build and govern the centralized semantic model that powers reporting across the company.
- Advise and strongly influence enterprise-level decisions around Microsoft Fabric, data modeling strategy, and long-term architectural direction—and own the work that follows those decisions.
- Collaborate with engineering and system owners to coordinate schema changes, data integrations, and cross-system alignment.
Leadership & Collaboration
- Partner with C-suite and senior leaders to define reporting roadmaps, enterprise priorities, and data strategy.
- Communicate complex architectural concepts in clear, business-friendly terms.
- Lead cross-functional initiatives that require unified data structures or scalable reporting.
- Apply automation (Power Automate, Fabric pipelines) and AI tools to improve reporting efficiency, data quality, and governance workflows.
Ideal Candidate Profile
- Deep hands-on expertise with Power BI, Microsoft Fabric, data modeling, and cloud data platforms.
- Track record of establishing and enforcing enterprise reporting standards and governance.
- Strong architectural intuition: semantic modeling, master data definition, cross-system alignment, and scalable design.
- Able to operate as both an individual contributor and a strategic leader.
- Experience managing reporting governance artifacts (certified datasets, endorsed reports, workspace strategy).
- Comfortable influencing architectural decisions and guiding technical execution.
- Strong command of foundational tools and languages such as:
- DAX
- Power Query / M
- SQL
- Fabric pipelines / ETL tooling
- Experience with automation and AI-assisted analytics workflows.
Description
Meditech Data Extraction & Reporting Engineer
Position Overview
We are seeking an experienced Meditech data engineering specialist to support healthcare data archiving and legacy application retirement projects.
This role focuses on extracting data from legacy Meditech systems and transforming it into relational SQL databases while also preserving Meditech reporting outputs and report logic for use within an archive platform.
The ideal candidate has deep experience with Meditech Client/Server data structures, NPR reporting frameworks, and the Meditech data dictionary. This individual will design and implement processes that convert Meditech hierarchical data structures into normalized relational schemas and enable reproduction or preservation of Meditech reports within a long-term archive environment.
________________________________________
Required Qualifications
Meditech Platform Experience
Strong hands-on experience working with Meditech environments such as (but not limited to):
• Meditech Client/Server
• Meditech Magic
• Meditech 6.x
Experience working with:
• Meditech DPM structures
• NPR reporting systems
• Meditech dictionaries and pointer relationships
• Meditech segment layouts
________________________________________
Technical Skills
• Advanced SQL development experience
• Experience designing relational database schemas
• Experience translating hierarchical data models into relational structures
• Experience building data extraction and transformation pipelines
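Translating a hierarchical record, such as a Meditech-style segment tree, into relational rows typically means emitting parent and child tables linked by keys. A hedged sketch with an invented patient/visit structure standing in for real Meditech DPM segment layouts:

```python
def flatten(patient):
    """Split one nested record into relational parent/child rows.

    The nesting (patient -> visits) is a hypothetical stand-in for a
    hierarchical Meditech-style structure; real segment layouts differ.
    """
    # Parent table row, keyed by patient_id.
    patient_row = {"patient_id": patient["id"], "name": patient["name"]}
    # Child table rows, each carrying the parent's key as a foreign key.
    visit_rows = [
        {"visit_id": v["id"], "patient_id": patient["id"], "date": v["date"]}
        for v in patient.get("visits", [])
    ]
    return patient_row, visit_rows

p = {"id": 1, "name": "Doe, Jane",
     "visits": [{"id": 10, "date": "2024-01-05"}, {"id": 11, "date": "2024-02-12"}]}
parent, children = flatten(p)
print(parent, len(children))
```

The foreign key copied into each child row is what preserves the original hierarchy once the data lands in normalized SQL tables.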
Job Type & Location
This is a Contract position based out of Columbus, OH.
Pay and Benefits
The pay range for this position is $55.00 - $80.00/hr.
Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to specific elections, plan, or program terms. If eligible, the benefits available for this temporary role may include the following:
• Medical, dental & vision
• Critical Illness, Accident, and Hospital
• 401(k) Retirement Plan – Pre-tax and Roth post-tax contributions available
• Life Insurance (Voluntary Life & AD&D for the employee and dependents)
• Short and long-term disability
• Health Spending Account (HSA)
• Transportation benefits
• Employee Assistance Program
• Time Off/Leave (PTO, Vacation or Sick Leave)
This is a fully remote position.
Application Deadline
This position is anticipated to close on Mar 20, 2026.
About TEKsystems:
We're partners in transformation. We help clients activate ideas and solutions to take advantage of a new world of opportunity. We are a team of 80,000 strong, working with over 6,000 clients, including 80% of the Fortune 500, across North America, Europe and Asia. As an industry leader in Full-Stack Technology Services, Talent Services, and real-world application, we work with progressive leaders to drive change. That's the power of true partnership. TEKsystems is an Allegis Group company.
The company is an equal opportunity employer and will consider all applications without regard to race, sex, age, color, religion, national origin, veteran status, disability, sexual orientation, gender identity, genetic information or any characteristic protected by law.
About TEKsystems and TEKsystems Global Services
We’re a leading provider of business and technology services. We accelerate business transformation for our customers. Our expertise in strategy, design, execution and operations unlocks business value through a range of solutions. We’re a team of 80,000 strong, working with over 6,000 customers, including 80% of the Fortune 500 across North America, Europe and Asia, who partner with us for our scale, full-stack capabilities and speed. We’re strategic thinkers, hands-on collaborators, helping customers capitalize on change and master the momentum of technology. We’re building tomorrow by delivering business outcomes and making positive impacts in our global communities. TEKsystems and TEKsystems Global Services are Allegis Group companies. Learn more at .
Materials Data Specialist
Radiant is seeking a Materials Data Specialist to support the development of our internal materials database. This role focuses on researching, validating, and organizing material property data used by engineering teams across reactor design, thermal systems, and structural analysis.
The ideal candidate has a technical background in materials science, materials engineering, or a related engineering discipline and enjoys working at the intersection of engineering research, data quality, and structured documentation.
You will evaluate the quality of material property sources, organize data into defined schemas, and contribute to documentation that helps engineers confidently use materials data in design and analysis.
Responsibilities
Research Materials Data
- Search scientific literature, databases, and reports to identify relevant material property data.
- Evaluate the quality, reliability, and applicability of material property sources.
- Flag inconsistencies or uncertainty in data sources.
Structure Engineering Data
- Enter material property data into internal databases following defined schemas and standards.
- Maintain consistent formatting and traceability of data sources.
- Ensure data integrity and reproducibility for engineering use.
Document Materials Information
- Write concise descriptions of materials and their properties in supporting reports.
- Summarize relevant test conditions, limitations, and assumptions for engineering teams.
- Maintain clear documentation of data sources and methodologies.
Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.
Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early stage data architecture.
Drive long term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
Perform data audits, profiling, validation, anomaly detection, and quality gap identification.
Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.
Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
Conduct root cause analysis and implement preventive and long term remediation solutions.
Optimize SQL queries, tune stored procedures, and improve data processing performance.
Document audit findings, validation processes, data flows, standards, and quality reports.
Build dashboards and reports for data quality KPIs using Power BI/Tableau.
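Data quality rules of the kind listed above (completeness, duplicate detection, validation) reduce to a few measurable checks. A minimal profiling sketch in Python; the `member_id`/`dob` column names are hypothetical, and in this role the real rules would live in T-SQL and SSIS:

```python
def profile(rows, required=("member_id", "dob")):
    """Basic data-quality profiling: null rates and a duplicate count.

    Column names are hypothetical; swap in the real schema's
    required fields when applying this pattern.
    """
    total = len(rows)
    # Completeness: fraction of rows where each required field is empty.
    null_rate = {
        col: sum(1 for r in rows if not r.get(col)) / total
        for col in required
    }
    # Uniqueness: rows minus distinct rows gives the duplicate count.
    dupes = total - len({tuple(sorted(r.items())) for r in rows})
    return {"null_rate": null_rate, "duplicates": dupes}

rows = [
    {"member_id": "A1", "dob": "1980-01-01"},
    {"member_id": "A1", "dob": "1980-01-01"},   # exact duplicate
    {"member_id": "A2", "dob": None},           # missing dob
]
print(profile(rows))
```

Scores like these are what feed the Power BI/Tableau quality dashboards mentioned above: each rule becomes a KPI that can be tracked over time.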
Data Stewardship & Governance
Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
Ensure proper and consistent data usage across departments and systems.
Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
Provide training on data entry, data handling, stewardship practices, and data literacy.
Collaborate with cross functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield / Low-Maturity Environment
Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
Drive adoption of data quality and governance practices across business and technical teams.
Support long term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
Advanced T-SQL, SQL Server development, debugging, and performance tuning.
SSIS development, deployment, and troubleshooting.
Data profiling, validation rule design, quality scoring, and measurement techniques.
ETL/ELT pipeline design, debugging, and optimization.
Data modeling (conceptual, logical, physical).
Metadata management and lineage documentation.
Reporting and dashboarding with Power BI, Tableau, or similar tools.
Strong documentation and communication skills.
Preferred Skills
Knowledge of DAMA DMBoK, DCAM, MDM concepts, and governance frameworks.
Experience in low-maturity/greenfield data environments.
Familiarity with AI/ML data readiness and feature store aligned data structuring.
Cloud data engineering exposure (Azure, Databricks, GCP).
Education
Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.
Master’s degree preferred.
Certifications (Preferred)
DAMA CDMP (Associate/Practitioner)
EDM Council DCAM
ASQ Data Quality Credential
Collibra Data Steward Certification
Certified Data Steward (eLearningCurve)
Cloud/AI certifications (Azure, Databricks, Google)
Get Hired by taking action.
If you just graduated (or you're about to) and the job search is already feeling confusing, you're not imagining it.
A degree proves you can learn—but employers hire for job readiness: projects that look like real work, current tech stacks, interview confidence, and the ability to contribute on day one.
That's why many new grads send hundreds of applications and still hear nothing back.
It's not because you're "not smart enough." It's because most entry-level pipelines are crowded, and hiring teams filter heavily for candidates who look production-ready.
We are actively considering candidates for entry-level software engineering and data roles, especially Java full stack, Java/Python development, DevOps automation, data analytics, data engineering, data science, and ML/AI—full-time opportunities aligned to client needs.
Our core emphasis remains Java/Full Stack/DevOps and Data/Analytics/Engineering/ML.
SynergisticIT focuses on two high-demand lanes: Java / Full Stack / DevOps and Data (Data Analyst, Data Engineer, Data Scientist) + ML/AI, so you don't graduate with scattered skills; you graduate with an employable stack.
SynergisticIT, since 2010, has helped candidates land full-time roles at major organizations (examples often cited include Google, Apple, PayPal, Visa, Western Union, Wells Fargo, Client, Banking, Wayfair, Client, Client, and more) with offers commonly in the $95k–$154k range depending on role and skill depth.
For a new grad, the bigger message isn't the number—it's that results require a structured pathway, not random applications.
Here's a realistic way to think about your advantage as a fresh graduate: you're early enough to build the right foundation before bad habits set in.
If you master fundamentals (coding, debugging, data structures, system thinking) and then layer modern tools on top (frameworks, cloud, CI/CD, analytics stacks), you become the kind of "entry-level" candidate who actually feels like a safe hire.
What roles are companies hiring for right now? A typical market demand pattern is clear: organizations still need entry-level software programmers, Java full stack developers, Python/Java developers, DevOps-focused engineers, and on the data side data analysts, BI analysts, data engineers, data scientists, and machine learning engineers.
The strongest candidates aren't "tool collectors"; they're people who can show end-to-end capability: build an API, connect a database, deploy a service, analyze data, explain results, and handle interviews calmly.
Why fresh grads get stuck
Fresh grads often struggle for four predictable reasons:
- Resume doesn't match job keywords (ATS filters you out).
- Projects look like school assignments (not production-aligned).
- Interview skills are undertrained (DSA, system design, SQL, behavioral).
- No structured pipeline (random applying without feedback loops).
A job-placement-first approach addresses these systematically: build the right portfolio, practice the right interview questions, align your tech stack to roles, and keep improving until the market says "yes."
Who this path fits best
If you're a recent graduate, you'll likely fit if you match any of these:
- New grads in CS, Engineering, Math, or Statistics with limited job experience
- Students finishing Bachelor's or Master's programs who need a real hiring plan
- Candidates who apply consistently but don't get callbacks
- Candidates who reach interviews but struggle to close
- International students on F-1/OPT who need a job plan for STEM extension/H-1B timing
- Graduates with strong academics but thin practical experience
SynergisticIT supports STEM extension and work authorization pathways and, for candidates who need long-term stability, support related to H-1B and green card processes as part of employer-side realities.
If you're tired of guessing, stop treating your job search like a lottery.
Treat it like a project with milestones: skills → portfolio → interview readiness → targeted applications → scheduled interviews → offer.
If you want to explore, here are the key links:
- Event videos (OCW, JavaOne, Gartner):
- USA Today feature:
- Contact & get a roadmap:
Please read our blogs:
- Why do Tech Companies not Hire recent Computer Science Graduates | SynergisticIT
- What Recruiters Look for in Junior Developers | SynergisticIT
- Software engineering or Data Science as a career?
- How OPT Students Can Land Tech Jobs – SynergisticIT
Bottom line for fresh grads: Your degree is the starting line, not the finish line.
If you want to get hired faster, you don't need "more random courses." You need a guided, job-focused path and the right people around you.
In tech, it's not just what you learn—it's how you learn and who you build with that decides how far you go.
Please note: Resume databases are shared with clients, and interested clients will reach out directly if they find a qualified candidate for their req.
Resume submissions may also be shared with our JOPP team database.
If you do not wish to be contacted, please unsubscribe or do not submit your resume.
Interested in helping build the next phase of hyperscale data center expansion?
BlueSky Resource Solutions is partnering with a leading infrastructure services provider who is seeking a Regional Director of Data Center Infrastructure to oversee delivery operations within a major hyperscale market.
This role will lead ISP deployments within data center environments while building operational processes, developing field teams, and ensuring high levels of client satisfaction.
The ideal candidate is a hands-on operational leader with experience managing complex infrastructure projects, supporting business growth, and maintaining strong safety, quality, and financial performance standards.
Your project direction:
- Provide leadership for structured cabling and inside plant (ISP) infrastructure projects within large-scale data center environments.
- Oversee project lifecycle activities including planning, staffing, scheduling, quality control, and final project turnover.
- Build and lead field teams including supervisors, technicians, and project support staff.
- Develop and implement operational standards, documentation practices, testing procedures, and installation guidelines aligned with industry standards.
- Maintain strong relationships with enterprise and hyperscale data center customers, ensuring service-level commitments and project milestones are met.
- Collaborate with construction partners, electrical contractors, and facility operations teams to coordinate infrastructure deployment.
- Monitor project financials including labor forecasting, materials planning, change management, and cost control.
- Identify opportunities to improve operational efficiency through standardized processes, prefabrication, and digital reporting tools.
The best fit:
- 8+ years of experience in data center infrastructure, structured cabling, or network deployment environments.
- Experience managing field teams and overseeing multiple projects.
- Demonstrated experience managing project budgets, scheduling, and operational performance metrics.
- Strong client-facing communication and leadership capabilities.
- Industry certifications are considered a plus.
- Ability to meet site access requirements including background screening and safety compliance.
Sr Data & BI Engineer (Hybrid)
We’re partnering with a growing organization seeking a SQL-focused Data & BI Engineer to build and optimize data pipelines, support ETL processes, and drive reporting infrastructure. This role sits at the intersection of data engineering and business intelligence, with strong visibility across teams and leadership.
What You’ll Do
- Design, build, and maintain SQL-based data pipelines and transformations
- Develop and optimize ETL processes to support reporting and analytics
- Write performant SQL for data modeling, transformation, and downstream consumption
- Support and enhance reporting infrastructure (SSRS → Power BI migration)
- Partner with business and technical teams to deliver scalable data solutions
- Improve data quality, structure, and accessibility across systems
- Contribute to performance tuning and optimization of data workflows
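A SQL-based pipeline step of the kind described above often takes the form of a staging-to-reporting transformation. This sketch uses SQLite for illustration, and the `stg_orders`/`rpt_sales_by_region` table names are invented; the same pattern applies in SQL Server with T-SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Staging table: raw rows as loaded from a source system.
    CREATE TABLE stg_orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO stg_orders VALUES
        (1, 'east', 100.0), (2, 'east', 50.0), (3, 'west', 75.0);
    -- Transformation step: aggregate staging data into a reporting table
    -- shaped for downstream BI consumption.
    CREATE TABLE rpt_sales_by_region AS
        SELECT region, SUM(amount) AS total_amount, COUNT(*) AS order_count
        FROM stg_orders
        GROUP BY region;
""")
for row in conn.execute("SELECT region, total_amount FROM rpt_sales_by_region ORDER BY region"):
    print(row)
```

Keeping the transformation in SQL (rather than application code) makes it easy to inspect, tune, and later point at Power BI during an SSRS migration.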
What You Bring
- Strong SQL skills with experience in data transformation and pipeline development
- Experience with ETL tools or frameworks (SSIS or similar)
- Exposure to BI tools such as Power BI or SSRS
- Experience working with structured data models in a production environment
- Ability to operate across both data engineering and reporting use cases
Environment
- Hybrid: 3 days onsite
- Evolving data environment with active investment in modernization
- Transitioning reporting stack from SSRS to Power BI
- Collaborative team with dedicated DBA support
Compensation
$120K – $140K base + bonus potential and good benefits
Job Overview:
We are seeking a Data Engineer to support data pipeline development and ETL processes.
Responsibilities:
Build and maintain ETL pipelines
Optimize database performance
Work with structured and unstructured data
Ensure data integrity
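The responsibilities above follow the standard extract-transform-load shape. A minimal skeleton, with in-memory stand-ins for the source and destination (a real pipeline would read from files or APIs and write to a warehouse):

```python
# A minimal extract -> transform -> load skeleton. The data and the
# list-based "warehouse" are placeholders for real sources and targets.
def extract():
    return ["  Alice,30 ", "Bob,25", "Alice,30"]            # raw CSV-ish lines

def transform(lines):
    # Parse, trim whitespace, and dedupe to protect data integrity.
    rows = [tuple(part.strip() for part in line.split(",")) for line in lines]
    return sorted(set(rows))

def load(rows, destination):
    destination.extend(rows)
    return len(rows)                                        # rows loaded

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse)
```

Separating the three stages keeps each one independently testable, which is what makes pipelines like this maintainable as sources multiply.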
Requirements:
Degree in Data Engineering or related field
Knowledge of SQL, Python
Familiarity with data warehousing concepts