MTC Data Jobs in USA

12,701 positions found

Databricks Data Engineer
✦ New
Salary not disclosed
Farmington, CT 6 hours ago

**Must be able to be onsite in Farmington, CT 2 days a week for collaboration**


The Opportunity: We are seeking a software engineer/developer or ETL/data integration/big data developer with experience in projects emphasizing data processing and storage. This person will be responsible for supporting data ingestion, transformation, and distribution to end consumers. The candidate will perform requirements analysis, design and develop process flows, build unit and integration tests, and create/update process documentation.


  • Work with the Business Intelligence team and operational stakeholders to design and implement both the data presentation layer available to the user community and the underlying technical architecture of the data warehousing environment.
  • Develop scalable and reliable data solutions to move data across systems from multiple sources in both real-time and batch modes.
  • Design and develop database objects: tables, stored procedures, views, etc.
  • Independently analyze, solve, and correct issues in real time, providing end-to-end problem resolution.
  • Design and develop ETL processes that transform a variety of raw data, flat files, and Excel spreadsheets into SQL databases.
  • Understand the concepts of data marts and data lakes, with experience migrating legacy systems to data marts/lakes.
  • Use additional cloud technologies (e.g., understand cloud services such as Azure SQL Server).
  • Maintain comprehensive project documentation.
  • Aptitude to learn new technologies and the ability to perform continuous research, analysis, and process improvement.
  • Strong interpersonal and communication skills to work in a team environment including customer and contractor technical staff, end users, and management team members.
  • Manage multiple projects, responsibilities, and competing priorities.
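
The flat-file-to-SQL ETL duty above can be sketched minimally as follows. This is an illustration, not the employer's actual pipeline: sqlite3 stands in for the SQL Server/Oracle targets the posting names, and the table name and column layout are assumptions.

```python
# Hypothetical ETL sketch: transform rows from a delimited flat file and
# load them into a SQL database. sqlite3 stands in for a production RDBMS;
# the "orders" schema is an illustrative assumption.
import csv
import io
import sqlite3

def load_flat_file(conn: sqlite3.Connection, fh) -> int:
    """Parse, transform, and load flat-file rows; returns the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT PRIMARY KEY, amount REAL)"
    )
    reader = csv.DictReader(fh)
    # Transform step: cast the amount column to a numeric type.
    rows = [(r["order_id"], float(r["amount"])) for r in reader]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

# Usage: an in-memory database and an in-memory stand-in for the flat file.
conn = sqlite3.connect(":memory:")
sample = io.StringIO("order_id,amount\nA-1,19.99\nA-2,5.00\n")
loaded = load_flat_file(conn, sample)
```

In a real engagement the same shape would gain per-row validation, rejected-row logging, and an idempotent load strategy.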


Requirements / Experience Needed:

  • Programming languages, frameworks, and file formats such as Python, SQL, PL/SQL, and VB
  • Database platforms such as Oracle, SQL Server, MySQL
  • Big data concepts and technologies such as Synapse and Databricks
  • AWS and Azure cloud computing
  • HVR data replication

Not Specified
Project Administrator - Data Center Information Technology/Design/Engineer Firm
✦ New
Salary not disclosed
Edison, NJ 6 hours ago

Job Title: Project Administrator - Data Center Information Technology/Design/Engineer Firm

Job Type: Full-time

Job Location: On-Site Edison, NJ


Project Administrator

A growing NJ data center information technology/design/engineering firm is seeking a Project Administrator to join our team. The Project Administrator is responsible for managing the administrative tasks and logistical aspects of data center construction or expansion projects, including coordinating with various teams, tracking project progress, maintaining documentation, and ensuring smooth execution of project activities under the guidance of a project manager, all while adhering to deadlines and budget constraints. In essence, this role is the organizational backbone of the project.


Responsibilities include but are not limited to:

  1. Support project team on all administrative tasks and duties.
  2. Heavy client/vendor/supplier interaction.
  3. Preparation of spreadsheet reports, contracts documents, purchase and change order requests, presentations, and correspondence.
  4. Receive, maintain, and distribute submittals, RFIs, and shop drawings, and establish a project log to record their receipt and disposition.
  5. Coordinate project meetings and travel arrangements.
  6. Maintain electronic and manual database of all project files and archives.
  7. Other responsibilities normally performed in the execution of a Project Administrator position according to standard Architectural/Engineering industry practices.
  8. Assist architects/engineers with editing/issuing project book specifications (electronic, e.g., MasterSpec).


Qualifications:

  1. Three to five years' experience in engineering firms or related fields.
  2. BA is recommended.
  3. Must exhibit initiative, judgment, and quality in performance and responsibilities.
  4. Deadline and detail oriented.
  5. Proficiency in Microsoft Office: Word, Excel, Outlook, PowerPoint.
  6. Ability to work well with multiple disciplines in a fast-paced environment.


Work Schedule:

This is a full-time, in-office position; it is not remote or virtual.

Normal business hours are Monday through Friday, 8 a.m. to 5 p.m.


Benefits:

  • 401(k) match up to $3,500
  • Full health insurance: medical/dental/prescription/life (75% paid by employer, 20-25% paid by employee)
  • After 3/5 years, eligibility (based on rating) for the company-owned NJ shore house
  • Tuition reimbursement for employees
  • Subjective year-end bonus plan (year ends September)
  • Awards/recognition for superior effort and extraordinary excellence
  • Longevity awards at 5/10/15/20/25/30+ years
  • After 15 years, eligibility for education assistance for children
  • After 15 years, eligibility for additional retirement compensation (elective)


BRUNS-PAK is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, gender identity and/or expression, national origin, disability, veteran, or other protected status.

Not Specified
Staff Data Engineer ( Boston or Chicago )
✦ New
Salary not disclosed
Chicago, IL 1 day ago

Company Description

PG Forsta is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts, and build lifetime loyalty among their customers and employees.

Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that have earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients, you will make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.

Our Mission:

We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.

Our Values:

To put Human Experience at the heart of organizations so every person can be seen and understood.

  • Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.
  • Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.
  • Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.
  • Dare to innovate: We challenge the status quo with creativity and innovation as our true north.
  • Better together: We check our egos at the door. We work together, so we win together.

Press Ganey is looking to hire a self-motivated Staff Data Engineer with data platform experience. The Staff Data Engineer (Platform) will play a crucial role in designing, implementing, and architecting frameworks, systems, and automation that support the development, deployment, and observability of state-of-the-art large language models (LLMs) and generative AI solutions. This position focuses on creating scalable, reliable systems and processes that streamline the developer experience and empower analysts and data scientists. The ideal candidate will have strong foundational skills in cloud infrastructure, automation, and DevOps practices, as well as experience implementing data pipelines and deployment automation for ML and analytical workloads.

Duties & Responsibilities

Design and implement processes, systems and automation to streamline the development and deployment of AI solutions.
Architect robust, reliable solutions for specific AI applications using appropriate cloud-based and open source technologies.
Design and automate data pipelines to deliver complex data products to power training and online inference of AI systems.
Deploy ML models, LLMs and GenAI systems into production, ensuring reliability, efficiency, and scalability across cloud or hybrid environments.
Build and maintain robust CI/CD pipelines tailored to ML model lifecycle management, ensuring a streamlined and agile deployment process.
Monitor model performance, identify potential improvements, and integrate feedback loops for continuous learning and adaptation.
Integrate models with chat interfaces and conversational platforms to create responsive, user-centric applications.
Investigate and implement agent-based architectures that support conversational intelligence and interaction modeling.
Collaborate with cross-functional teams to design AI-driven features that enhance user experience and interaction within chat interfaces.
Work closely with data scientists, product managers, and engineers to ensure alignment on project goals, data requirements, and system constraints.
Mentor junior engineers and provide guidance on best practices in ML model development, deployment, and maintenance.
Create and maintain comprehensive documentation for model architectures, code implementations, data workflows, and deployment procedures to ensure reproducibility, transparency, and ease of collaboration.
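
The model-monitoring duty above ("monitor model performance ... integrate feedback loops") can be sketched as a rolling accuracy check that flags a model for retraining. This is a minimal illustration, not Press Ganey's system; the window size and threshold are assumptions.

```python
# Hypothetical monitoring sketch: keep a rolling window of prediction
# outcomes and flag the model when accuracy degrades below a threshold.
from collections import deque

class AccuracyMonitor:
    def __init__(self, window: int = 100, threshold: float = 0.9):
        self.outcomes = deque(maxlen=window)  # True = prediction was correct
        self.threshold = threshold

    def record(self, correct: bool) -> None:
        self.outcomes.append(correct)

    def accuracy(self) -> float:
        return sum(self.outcomes) / len(self.outcomes) if self.outcomes else 1.0

    def needs_retraining(self) -> bool:
        # Only alert once the window is full, so a few early misses
        # do not trigger a spurious retraining cycle.
        return (
            len(self.outcomes) == self.outcomes.maxlen
            and self.accuracy() < self.threshold
        )

# Usage: a full window at 70% accuracy trips the 80% threshold.
monitor = AccuracyMonitor(window=10, threshold=0.8)
for correct in [True] * 7 + [False] * 3:
    monitor.record(correct)
```

In production the same signal would feed a dashboard or an automated retraining job rather than a boolean flag.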
Technical Skills

Experience with large-scale deployment tools and environments, including Docker, Kubernetes, and cloud platforms like AWS, Azure, or GCP.
Experience deploying and managing a variety of database technologies.
Experience deploying ML models at scale and optimizing models for low-latency, high-availability environments.
Strong programming skills in Python and proficiency in libraries such as NumPy, Pandas, and Scikit-learn.
Experience with data pipelines, ETL processes, and experience with distributed data frameworks like Apache Spark or Dask.
Familiarity with machine learning frameworks such as TensorFlow, PyTorch, and Hugging Face Transformers.
Knowledge of conversational AI, agent-based systems, and chat interface development.
Proven track record in deploying and maintaining ML and AI solutions in a production setting.
Experience with version control (e.g., Git) and CI/CD tools tailored to ML workflows.
Experience with MLOps.
Experience with Databricks is a plus.

Qualifications

Minimum Qualifications

5+ years of experience in platform engineering with a focus on data and ML systems.
Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.

Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.

Additional Information for US based jobs:

Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer and is committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.

Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.

The expected base salary for this position ranges from $100,000 to $140,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.

All your information will be kept confidential according to EEO guidelines.

Our privacy policy can be found here: legal-privacy/

Not Specified
Staff Data Engineer
✦ New
🏢 PG Forsta
Salary not disclosed
Emeryville, CA 1 day ago

Company Description

Press Ganey is the leading experience measurement, data analytics, and insights provider for complex industries, a status we earned over decades of deep partnership with clients to help them understand and meet the needs of their key stakeholders. Our earliest roots are in U.S. healthcare, perhaps the most complex of all industries. Today we serve clients around the globe in every industry to help them improve the Human Experiences at the heart of their business. We serve our clients through an unparalleled offering that combines technology, data, and expertise to enable them to pinpoint and prioritize opportunities, accelerate improvement efforts, and build lifetime loyalty among their customers and employees.

Like all great companies, our success is a function of our people and our culture. Our employees have world-class talent, a collaborative work ethic, and a passion for the work that have earned us trusted advisor status among the world's most recognized brands. As a member of the team, you will help us create value for our clients, you will make us better through your contribution to the work and your voice in the process. Ours is a path of learning and continuous improvement; team efforts chart the course for corporate success.

Our Mission:

We empower organizations to deliver the best experiences. With industry expertise and technology, we turn data into insights that drive innovation and action.

Our Values:

To put Human Experience at the heart of organizations so every person can be seen and understood.

  • Energize the customer relationship: Our clients are our partners. We make their goals our own, working side by side to turn challenges into solutions.

  • Success starts with me: Personal ownership fuels collective success. We each play our part and empower our teammates to do the same.

  • Commit to learning: Every win is a springboard. Every hurdle is a lesson. We use each experience as an opportunity to grow.

  • Dare to innovate: We challenge the status quo with creativity and innovation as our true north.

  • Better together: We check our egos at the door. We work together, so we win together.

We are seeking an experienced Staff Data Engineer to join our Unified Data Platform team. The ideal candidate will design, develop, and maintain enterprise-scale data infrastructure leveraging Azure and Databricks technologies. This role involves building robust data pipelines, optimizing data workflows, and ensuring data quality and governance across the platform. You will collaborate closely with analytics, data science, and business teams to enable data-driven decision-making.

Duties & Responsibilities:

  • Design, build, and optimize data pipelines and workflows in Azure and Databricks, including Data Lake and SQL Database integrations.
  • Implement scalable ETL/ELT frameworks using Azure Data Factory, Databricks, and Spark.
  • Optimize data structures and queries for performance, reliability, and cost efficiency.
  • Drive data quality and governance initiatives, including metadata management and validation frameworks.
  • Collaborate with cross-functional teams to define and implement data models aligned with business and analytical requirements.
  • Maintain clear documentation and enforce engineering best practices for reproducibility and maintainability.
  • Ensure adherence to security, compliance, and data privacy standards.
  • Mentor junior engineers and contribute to establishing engineering best practices.
  • Support CI/CD pipeline development for data workflows using GitLab or Azure DevOps.
  • Partner with data consumers to publish curated datasets into reporting tools such as Power BI.
  • Stay current with advancements in Azure, Databricks, Delta Lake, and data architecture trends.
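
The ETL/ELT framework duty above usually centers on idempotent upserts, the pattern behind Delta Lake's MERGE INTO on Databricks. A minimal sketch, with sqlite3 standing in for the lakehouse and an assumed "customers" table:

```python
# Hypothetical idempotent ELT upsert: insert new rows, update existing
# ones, and stay safe to re-run on the same batch (MERGE-style semantics).
# sqlite3 stands in for Delta Lake; the schema is an illustrative assumption.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")

def merge_batch(rows):
    """Upsert a batch keyed on id; re-running a batch changes nothing."""
    conn.executemany(
        "INSERT INTO customers (id, email) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET email = excluded.email",
        rows,
    )
    conn.commit()

merge_batch([(1, "a@x.com"), (2, "b@x.com")])
merge_batch([(2, "b2@x.com"), (3, "c@x.com")])  # one update, one insert
rows = sorted(conn.execute("SELECT id, email FROM customers"))
```

Idempotency is what lets a failed pipeline run be retried without producing duplicates.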

Technical Skills:

  • Advanced proficiency in Azure (5+ years): Data Lake, ADF, SQL.
  • Strong expertise in Databricks (5+ years), Apache Spark (5+ years), and Delta Lake (5+ years).
  • Proficient in SQL (10+ years) and Python (5+ years); familiarity with Scala is a plus.
  • Strong understanding of data modeling, data governance, and metadata management.
  • Knowledge of source control (Git), CI/CD, and modern DevOps practices.
  • Familiarity with the Power BI visualization tool.

Minimum Qualifications:

  • Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
  • 7+ years of experience in data engineering, with significant hands-on work in cloud-based data platforms (Azure).
  • Experience building real-time data pipelines and streaming frameworks.
  • Strong analytical and problem-solving skills.
  • Proven ability to lead projects and mentor engineers.
  • Excellent communication and collaboration skills.

Preferred Qualifications:

  • Master's degree in Computer Science, Engineering, or a related field.
  • Exposure to machine learning integration within data engineering pipelines.

Don't meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Press Ganey we are dedicated to building a diverse, inclusive, and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway. You may be just the right candidate for this or other roles.

Additional Information for US based jobs:

Press Ganey Associates LLC is an Equal Employment Opportunity/Affirmative Action employer and is committed to a diverse workforce. We do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity, veteran status, disability, or any other federal, state, or local protected class.

Pay Transparency Non-Discrimination Notice - Press Ganey will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information.

The expected base salary for this position ranges from $110,000 to $170,000. It is not typical for offers to be made at or near the top of the range. Salary offers are based on a wide range of factors including relevant skills, training, experience, education, and, where applicable, licensure or certifications obtained. Market and organizational factors are also considered. In addition to base salary and a competitive benefits package, successful candidates are eligible to receive a discretionary bonus or commission tied to achieved results.

All your information will be kept confidential according to EEO guidelines.

Our privacy policy can be found here: legal-privacy/

Not Specified
Data Quality Specialist
Salary not disclosed
Kennett Square, PA 3 days ago

Job Description:

Overview:

We don't simply hire employees. We invest in them. When you work at Chatham, we empower you, offering professional development opportunities to help you grow in your career, whether you've been here for five months or 15 years. Chatham has worked hard to create a distinct work environment that values people, teamwork, integrity, and client service. You will have immediate opportunities to partner with talented subject matter experts, work on complex projects, and contribute to the value Chatham delivers every day.

We seek to enhance our Controls and Data Integrity team with a role specializing in data quality for interest rate, currency, and commodity transactions. The role is part of our global central operations group charged with ensuring the accuracy and reliability of Chatham's transaction, market, and valuation data.

In this role you will:

The purpose of the role is to ensure all transaction details are in Chatham's systems accurately and as agreed upon at execution. Data entry errors can have significant consequences to the economics of the transaction or to their accounting treatment, and it is therefore critical that team members understand transaction-related market conventions, payments, and valuations. This role will provide support for transactions executed by Chatham's real estate, private equity, corporate, and financial institutions sectors. We expect primary responsibilities to include:

  • Transaction and data review
    • Work as part of the larger team to check the data entry on transactions as they are executed
    • Verify calculation amounts and build payment schedules
    • Develop an understanding of the underlying transactions in order to identify loading errors
    • Check daily control reports to monitor unusual movements in transaction valuations and market data
    • Assist with data clean-up related to transaction data and Client Relationship Management (CRM) software
  • Communicate and coordinate across other internal teams and with clients
    • Interact with sector team members to verify/clarify data, as needed
    • Work with internal models, analytics, and technology teams to resolve issues
    • Play an active role in liaising between the business and technical teams
    • Check and send out monthly valuation reports to clients
  • Develop and share subject matter expertise
    • Take part in the training of new Chatham employees on sector teams
    • Serve as an integral member of ad hoc project teams to improve processes, solve problems, and provide insight from a data quality perspective
    • Develop SQL skills and help create database queries
  • The role may also include opportunities to contribute to the team in other capacities as interests and team needs align.

Your impact:

Our team works in partnership with Chatham's sector advisory teams and clients to help them efficiently navigate the data quality, operational, and regulatory compliance aspects of a transaction. We strive to continually improve the workflows we are responsible for and have the chance to do so by implementing process changes and/or leveraging supporting technology. Team members play a crucial role in these process improvements and serve as subject matter experts, providing regular training and resources for all Chatham teams.

Contributors to your success:

  • 2 years of experience working in operations or data quality may be beneficial but is not required
  • An interest in data quality, data management, and process improvement
  • Comfort with basic math skills and use of Microsoft Excel
  • High level of attention to detail, accuracy, and organization
  • Ability to multitask and independently prioritize workload
  • Strong verbal and written communication skills
  • Ability to work extra/non-standard hours around month- and quarter-ends (and other special cases) to support critical business processes
  • Experience with VBA and SQL is beneficial, but not necessary

We seek individuals that will thrive in our culture and can make a significant impact over the long term. Most of our team members do not come to Chatham with a deep understanding of derivatives; therefore, we conduct classroom and apprentice-style training. We look for people who have consistently demonstrated drive, determination, and academic/professional accomplishment throughout their lives. We invest a great deal of time and training with our employees and we are looking for individuals who want to make a long-term commitment to the company.

About Chatham Financial:

Chatham Financial is the largest independent financial risk management advisory and technology firm. A leader in debt and derivative solutions, Chatham provides clients with access to in-depth knowledge, innovative tools, and an incomparable team of over 700 employees to help mitigate risks associated with interest rate, foreign currency, and commodity exposures. Founded in 1991, Chatham serves more than 3,500 companies across a wide range of industries - handling over $1 trillion in transaction volume annually and helping businesses maximize their value in the capital markets, every day. To learn more, .

Chatham Financial is an equal opportunity employer.

Not Specified
Logistics Data, Specialist I
Salary not disclosed
Mesquite, TX 2 days ago

Title: Specialist I, Logistics Data


Job Summary: The Logistics Data Specialist is responsible for managing logistics master data, validating transactional accuracy, and delivering analytics that support transportation planning, customs execution, warehousing, and freight settlement. This role partners with Operations, Procurement, Trade Compliance, and Finance to ensure information reliability and actionable reporting.


Responsibilities include:

  • Maintain carriers, lanes, rates, BOMs, HTS, and partner master data in TMS/WMS/SAP.
  • Perform audits on shipments tracking milestones, POD, cost allocation, and accrual triggers.
  • Identify root causes of data discrepancies and implement corrective actions.
  • Build SOPs for data entry, validation logic, and exception handling.
  • Develop dashboards for OTIF, GIT, transit time, freight spend, accessorial, claims, and capacity utilization.
  • Provide weekly/monthly KPI packs to operations leadership.
  • Support budget vs. actual analysis and PR forecast modeling.
  • Translate business requirements into SQL/BI outputs.
  • Validate rating, fuel, and accessorial charges.
  • Support three-way match among PO, shipment, and invoice.
  • Prepare accrual and variance reports.
  • Assist with audit requests from Finance.
  • Act as super-user for TMS/WMS modules.
  • Drive automation to reduce manual work.
  • Work with transportation, warehouse, procurement, and customs teams to improve data transparency.
  • Provide data analysis for RFPs, network optimization, and vendor reviews.
  • All other duties as assigned.
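
The three-way match duty above (PO vs. shipment vs. invoice) can be sketched as a simple reconciliation check. This is an illustration only; the field names and the amount tolerance are assumptions, not the employer's actual schema.

```python
# Hypothetical three-way match: verify that the shipment matches the PO,
# the invoice matches the shipment, and the billed amount matches PO pricing.
def three_way_match(po, shipment, invoice, tolerance=0.01):
    """Return a list of discrepancy strings; an empty list means the match passed."""
    issues = []
    if shipment["qty_received"] != po["qty_ordered"]:
        issues.append("quantity: shipment does not match PO")
    if invoice["qty_billed"] != shipment["qty_received"]:
        issues.append("quantity: invoice does not match shipment")
    expected_cost = invoice["qty_billed"] * po["unit_price"]
    if abs(invoice["amount"] - expected_cost) > tolerance:
        issues.append("amount: invoice differs from PO pricing")
    return issues

# Usage: quantities agree, but the carrier billed above the contracted rate.
po = {"qty_ordered": 100, "unit_price": 2.50}
shipment = {"qty_received": 100}
invoice = {"qty_billed": 100, "amount": 262.50}
issues = three_way_match(po, shipment, invoice)
```

In practice the same logic would run as a SQL join across the PO, shipment, and invoice tables, with exceptions routed to an accrual/variance report.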


Qualifications:

  • Bachelor’s degree in Supply Chain, Logistics, Business Analytics, or a related discipline.
  • 2+ years in logistics, transportation analytics, or supply chain systems.
  • Experience working with freight invoices, carrier data, or brokerage information is highly valued.
  • Advanced Excel (pivot tables, Power Query, XLOOKUP).
  • SQL or similar database querying.
  • BI tools such as Power BI, Tableau, or Looker.
  • Familiarity with SAP/TMS/WMS environments (e.g., SAP, Oracle, MercuryGate).
  • Strong analytical reasoning.
  • High attention to detail.
  • Comfortable in fast-moving, build-phase environments.


Physical Requirements and Working Conditions

  • Ability to sit for extended periods while working at a computer
  • Frequent use of hands and fingers for typing, filing, and operating office equipment
  • Occasional standing, walking, bending, and reaching
  • Ability to lift and carry light office materials (up to 10–15 lbs.), such as files or office supplies
  • Visual acuity to read screens, documents, and reports
  • Ability to attend meetings and interact with employees, clients, and vendors
Not Specified
Data Analyst - Hybrid position in Los Angeles, California
Salary not disclosed
Qualifications:

Minimum of 10 years of experience with data and metrics analysis required.

Minimum of five years' experience working in analytics with hospitals and health plans.

Advanced proficiency required with VBA, SQL, Salesforce, Excel and Access.

High-level skills using web applications and all browsers; ability to teach others how to use web-based database functions.

Demonstrated experience using Microsoft Office computer applications, including Word, Access, Outlook and SharePoint.

Advanced knowledge of Excel required.

Detail-oriented with strong follow-through and ability to work independently given standard guidelines and checklists.

Good writing and communication skills.

Able to draft grammatically correct and professional email messages.

Demonstrated experience in working successfully with minimal supervision.

Must have knowledge of medical and health care terminology.

Ability to complete HIPAA training and implement high-level protections on patient information and confidentiality.

Must work effectively independently and in a team setting.

Ability to relate well with internal and external customers.

Quality/Metrics:

Gather and perform analysis on data from Salesforce, Loopback, Excel, and other databases as required.

Perform data cleaning as needed to ensure data are consistent and analyzable.

Create data reports, charts, graphs and tables for regular reporting to program leads and external partners.

Export data from software systems and program tracking logs for agency reporting.

Assemble reports, papers and presentation materials as directed.

Collect data through phone and in-person interviews.

Record or transcribe data in accordance with project and funding source guidelines.

Perform literature reviews (locating, listing, and/or abstracting articles).

Enter literature references into a shared database (such as EndNote).

Responsibilities:

Data cleaning, formatting, and maintenance as needed.

Data visualization and analysis of program metrics.

Data Entry for the program(s) assigned.

Program reporting/billing/invoicing support.

Administrative duties as needed (mailing and other assigned work).

Establish and maintain systems for program accountability – reports track performance.

Attend and ensure follow-up after all meetings and presentations – minutes, reports, action plans, assignments, etc.

Monitors performance and responsibilities of field staff with respect to database management, metrics, and documents.

Reports all errors in systems and workflows to both internal and external individuals.

Completes reporting (both internal and contractual requirements) with thorough knowledge and understanding of what is being reported.

Develops and maintains a current understanding of the Department’s Contractual Agreements.

Must have professional verbal and written skills, computer/software skills.

Assists with both internal and external customer service calls, emails, and requests.

Other miscellaneous tasks, as assigned.

SQL Server database design, implementation, and troubleshooting.

Develop, optimize, and maintain complex T-SQL queries, stored procedures, indexes, and constraints; resolve performance issues, deadlocks, and contention using traces, execution plans, and profiling.
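The plan-inspection workflow mentioned above can be seen in miniature below. The role targets SQL Server and T-SQL; SQLite is used here purely because it is self-contained, but the idea (compare the plan before and after adding an index) carries over:

```python
import sqlite3

# In-memory table with enough rows that indexing matters.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 500, i * 1.5) for i in range(5000)])

def plan(sql):
    # EXPLAIN QUERY PLAN is SQLite's analogue of inspecting an execution plan;
    # the fourth column of each row is the human-readable plan step.
    return " | ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(query))  # full table scan: no usable index yet

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(plan(query))  # the same query now resolves via the index
```

In SQL Server the equivalent step is reading the estimated or actual execution plan for the query before and after the index change.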

Design, develop, test, and implement ETL/ELT processes using Talend for data extraction, transformation, and loading from diverse sources, including Salesforce CRM data.

Administer and optimize Talend environment, including job scheduling, dependencies, monitoring, automation, patches, upgrades, and performance tuning.

Integrate Salesforce data (e.g., via APIs, connectors) into SQL Server databases and data warehouses, ensuring data quality, synchronization, and real-time/batch processing.

Collaborate face-to-face/with business stakeholders to analyze requirements, gather specifications, evaluate data sources/targets, and design solutions that improve business performance.

Lead ETL development activities, ensure code quality, provide feedback on performance.

Support enterprise data warehouse, data marts, and business intelligence initiatives; perform source data analysis and dimensional modeling.

Develop and automate processes using scripting.

Provide tier 2/3 support, evaluate production issues, recommend improvements, and participate in project planning following Agile methodologies.

Perform proactive performance optimization and data synchronization across environments.

Mentor staff, recommend process enhancements, and contribute specialized knowledge across IT and business operations.

Document data integration processes, workflows, ETL designs, data mappings, technical specifications, and system configurations.

Manage version control and deployments.

Collaborate on testing (unit, integration, UAT).

Translated business requirements into actionable data specifications, documentation, and code solutions using Salesforce Object Manager and official documentation.

Reviewed Salesforce release notes, verified production deployments, and conducted feature testing across sandbox and production environments with detailed feedback submission.

Developed and maintained complex SOQL queries to support data team operations, reporting, and analytics needs.

Designed and built custom Salesforce reports to support data operations and Enhanced Care Management (ECM) programs.

Developed and deployed end-to-end solutions for processing health plan MIF data, enabling efficient insert, update, and reporting workflows for Lead and Case objects.

Performed large-scale data inserts, updates, and migrations using Salesforce Data Loader in both sandbox and production environments.

Extracted, analyzed, and transformed backend Salesforce data using Talend and SQL to produce accurate reports for compliance, billing, and operational needs.

Identified and resolved reporting discrepancies and data quality issues through root-cause analysis and targeted corrections.

Cleaned, standardized, and transformed referral data for mass uploads into Salesforce while enforcing validation rules and workflow requirements.

Created Salesforce-based error reports that enabled program teams to quickly identify and correct data entry issues.

Conducted data gap analyses against vendor reporting requirements and designed field transformations and new data structures to meet compliance and reporting standards.

Integrated offshore datasets with Salesforce records to address missing or incomplete data, improving accuracy for reporting and billing.

Reduced manual data entry and correction efforts by automating large-scale updates, inserts, and fixes via Salesforce Data Loader.

Maintained vendor zip code records in Salesforce to ensure accurate service area tracking, correct billing rates, and reliable historical reference.

Partners in Care Foundation is an equal opportunity employer.

We are committed to complying with all federal, state, and local laws providing equal employment opportunities, and all other employment laws and regulations.

It is our intent to maintain a work environment which is free of harassment, discrimination, or retaliation because of age, race (including hair texture and protective hairstyles, such as braids, locks, and twists), color, national origin, ancestry, religion, sex, sexual orientation, pregnancy (including childbirth, lactation/breastfeeding, and related medical conditions), physical or mental disability, genetic information (including testing and characteristics, as well as those of family members), veteran status, uniformed service member status, gender, gender identity, gender expression, transgender status, arrest or conviction record, domestic violence victim status, credit history, unemployment status, caregiver status, sexual and reproductive health decisions, salary history or any other status protected by federal, state, or local laws.

All qualified applicants will receive consideration for employment and reasonable accommodations may be made to enable qualified individuals to perform the essential functions of the position.
Remote working/work at home options are available for this role.
Manager, Enterprise Data Services, Data Analytics Enablement Manager
Salary not disclosed
College Park 6 days ago
Job Description Summary: This position is available within the University of Maryland’s Division of Information Technology (DIT).

The University of Maryland (UMD) seeks a Manager of Data Analytics Enablement to lead the adoption and modernization of enterprise analytics capabilities that enable trusted, data-informed decision-making across campus.

This is an exciting time to join UMD as we advance enterprise data and analytics through a period of innovative growth and modernization.

This role will play a key part in shaping the future of enterprise business intelligence, advancing Microsoft Power BI and Fabric capabilities, and embedding sustainable data quality and stewardship practices into analytics workflows.

Reporting to the Director of Enterprise Data Services, this position partners with institutional leaders, IT teams, and enterprise stakeholders to deliver reliable data products, consistent metrics, and actionable insights.

The manager will lead a team of data professionals and advance practical, operational governance practices that support trusted analytics and long-term institutional impact.

Key Responsibilities: Lead the strategy, development, and continuous improvement of the university’s enterprise business intelligence environment, including Microsoft Power BI and Microsoft Fabric.

Establish standards, best practices, and architectural patterns for semantic models, dashboards, and analytics delivery.

Guide migration and modernization efforts to ensure scalable, secure, and high-performing analytics solutions.

Develop and manage an analytics intake, prioritization, and delivery framework aligned with institutional priorities.

Define and implement data quality monitoring practices to ensure reliability, accuracy, and consistency of enterprise data assets.

Partner with technical teams to embed validation, monitoring, and observability into data pipelines and lakehouse environments.

Promote consistent metric definitions and collaborate with campus stakeholders to clarify data ownership and stewardship roles.

Support adoption of metadata management, data catalog, and lineage capabilities.

Ensure analytics solutions align with university standards for security, privacy, and responsible data use.

Manage, mentor, and develop a team of analytics and data professionals, fostering a culture of quality, collaboration, and service.

Communicate analytics priorities, progress, and impact to leadership and campus partners.
**This position is considered essential and may be required to work at the normal work location or an alternative location during a major catastrophic event, weather emergency, or other operational emergency to help maintain the continuity of University services.**

**May be required to work evenings, nights, weekends, or different shifts for extended periods.**

KNOWLEDGE, SKILLS, & ABILITIES: Knowledge of data privacy and security principles and practices necessary to protect systems and data from threats.

Knowledge in areas of subject matter expertise such as databases, data modeling, ETL, reporting, data governance practices, metadata management, data stewardship, and/or regulatory compliance.

Skill in SQL or programming/scripting languages (e.g., Python) used for integrations, data pipelines, report development, and data management.

Skill in adapting communication style to different audiences, including technical, business, and executive stakeholders.

Skill in the use of office productivity software such as Office 365 or Google Workspace.

Ability to lead presentations and training for large groups.

Ability to manage communications and relationships with technical and business stakeholders.

Ability to collaborate effectively with other Managers, Assistant Directors, and Directors to identify and solve problems, make improvements, and address ongoing issues.

Ability to provide a team with effective direction and support in implementations using standards and techniques that lead to a repeatable and reliable solution.

Ability to ensure documentation standards and procedures are implemented for all team responsibilities.

Ability to define deadlines and manage the quality of the work delivered.

Ability to comprehend and handle interpersonal dynamics, demonstrate empathy towards team members, and effectively manage conflicts or challenging circumstances.

Ability to coach and mentor team members in order to enhance their performance, provide constructive feedback, and support skill development.

Physical Demands: Sedentary work.

Exerting up to 10 pounds of force occasionally and/or negligible amount of force frequently or constantly to lift, carry, push, pull or otherwise move objects.

Repetitive motion.

Substantial movements (motions) of the wrists, hands, and/or fingers.

The worker is required to have close visual acuity to perform an activity such as: preparing and analyzing data and figures; transcribing; viewing a computer terminal; extensive reading.

Minimum Qualifications Education: Bachelor’s degree from an accredited college or university.

Experience: Three (3) years of professional experience supporting the operations, maintenance, and administration of data systems, analytics platforms, or data management programs.

One (1) year leading or supervising professional staff.

Other: Additional work experience as defined above may be substituted on a year-for-year basis for up to four (4) years of the required education.

Preferences: Demonstrated experience leading business intelligence or enterprise analytics initiatives.

Experience managing or mentoring data professionals in a collaborative team environment.

Strong experience with Power BI and modern data platforms such as Microsoft Fabric, Databricks, or similar cloud-based analytics ecosystems.

Proficiency with SQL and/or Python in support of analytics, data modeling, or data quality initiatives.

Experience implementing or advancing data quality practices, including validation, monitoring, or metric standardization.

Experience supporting practical data governance activities such as establishing shared definitions, coordinating data stewardship, or implementing metadata/catalog tools.

Demonstrated ability to collaborate across diverse stakeholders and translate business needs into scalable analytics solutions.

Strong communication skills with the ability to engage both technical and non-technical audiences.

Experience using Jira or similar tools for work intake, project tracking, and prioritization.

Additional Information: Please note that all positions within the Division of Information Technology (DIT) have an in-person component, with expected time each week at our College Park, MD location.

Telework is not a guaranteed work arrangement.

Visa Sponsorship Information: DIT will not sponsor the successful candidate for work authorization in the United States now or in the future.

F1 STEM OPT support is not available for this position.

Required Application Materials: Resume, Cover Letter, List of three References

Best Consideration Date: March 26, 2026

Open Until Filled: Yes

Salary Range: $149,120.00 - $178,944.00

Please apply at:

Job Risks: Not Applicable to This Position

Financial Disclosure Required: No

For more information on Financial Disclosure, please visit Maryland's State Ethics Commission website.

Department: DIT-EE-Enterprise Data Services

Worker Sub-Type: Staff Regular

Benefits Summary: For more information on Regular Exempt benefits, select this link.

Background Checks: Offers of employment are contingent on completion of a background check.

Information reported by the background check will not automatically disqualify anyone from employment.

Before any adverse decision, the finalist will have an opportunity to provide information to the University regarding disclosable background check information.

The University reserves the right to rescind the offer of employment or otherwise decline or terminate employment if the information reported by the background check is deemed incompatible with the position, regardless of when the background check is completed.

Employment Eligibility: The successful candidate must complete employment eligibility verification (on Form I-9) by presenting documents that establish identity and work authorization within the timeframe required by federal immigration law, and where applicable, to demonstrate renewed employment authorization.

Failure to complete employment eligibility verification or reverification within the timeframe set forth by law may result in suspension or termination of employment.

EEO Statement : The University of Maryland, College Park is an Equal Opportunity Employer.

All qualified applicants will receive equal consideration for employment.

Please read the University’s Equal Employment Opportunity Statement of Policy.

Title IX Non-Discrimination Notice

See above description for requirements.
SAP S/4HANA Functional Process Data Expert
Salary not disclosed
Atlanta 3 days ago
Summary:

Location: Atlanta, GA

Duration: 12 Months

100% Remote – open to any area

Responsibilities: Partner with global and regional business stakeholders to define data requirements aligned to standardized value stream processes.

Translate business process designs into clear master and transactional data definitions for S/4HANA.

Support template design by ensuring consistent data models, attributes, and hierarchies across geographies.

Validate data readiness for end-to-end process execution (Plan, Source, Make, Deliver, Return).

Define data objects, attributes, and mandatory fields.

Support business rules, validations, and derivations.

Align data structures to SAP best practices and industry standards.

Support data cleansing, enrichment, and harmonization activities.

Define and validate data mapping rules from legacy systems to S/4HANA.

Participate in mock conversions, data loads, and reconciliation activities.

Ensure data quality thresholds are met prior to cutover.
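A readiness gate like the one above can be sketched as a simple threshold check over load-validation results. The object names and thresholds below are illustrative assumptions, not values from the posting:

```python
# Minimum acceptable completeness rate per migrated object before cutover
# sign-off. Thresholds are illustrative, not project values.
THRESHOLDS = {"material": 0.99, "business_partner": 0.98, "bom": 0.97}

def gate(results):
    """results maps object name -> (valid_records, total_records).

    Returns the objects that fall below their threshold, with their rates,
    so the cutover team knows exactly what still needs remediation.
    """
    failures = []
    for obj, (valid, total) in results.items():
        rate = valid / total if total else 0.0
        if rate < THRESHOLDS.get(obj, 1.0):
            failures.append((obj, round(rate, 4)))
    return failures

# Mock reconciliation numbers from a trial data load.
mock_load = {"material": (9920, 10000), "business_partner": (4880, 5000), "bom": (2915, 3000)}
print(gate(mock_load))  # → [('business_partner', 0.976)]
```

In practice the counts would come from reconciliation reports after a mock conversion, and a non-empty failure list blocks cutover sign-off for those objects.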

Support the establishment and enforcement of global data standards and policies.

Work closely with Master Data and Data Governance teams.

Help define roles, ownership, and stewardship models for value stream data.

Contribute to data quality monitoring and remediation processes.

Support functional and integrated testing with a strong focus on data accuracy.

Validate business scenarios using migrated and created data.

Support cutover planning and execution from a data perspective.

Provide post-go-live support and stabilization.

Requirements: 5 years of SAP functional experience with a strong data focus.

Hands-on experience with SAP S/4HANA (greenfield preferred).

Proven involvement in large-scale, global ERP implementations.

Deep understanding of value stream business processes and related data objects.

Experience supporting data migration, cleansing, and validation.

Required Skills: Strong knowledge of SAP master data objects (e.g., Material, Vendor/Business Partner, BOM, Routings, Pricing, Customer, etc.).

Understanding of S/4HANA data model changes vs. ECC.

Experience working with SAP MDG or similar governance tools preferred.

Familiarity with data migration tools (e.g., SAP Migration Cockpit, LVM, ETL tools).

Ability to read and interpret functional specs and data models.

Strong stakeholder management and communication skills.

Ability to work across global, cross-functional teams.

Detail-oriented with strong analytical and problem-solving skills.

Comfortable operating in a fast-paced transformation environment.

Preferred Skills: Experience in manufacturing, building materials, or asset-intensive industries.

Prior role as Functional Data Lead or Data Domain Lead.

Experience defining global templates and harmonized data models.

Knowledge of data quality tools and metrics.

Experience with MDG and setting up cost center and profit center groups.
Data Quality Analyst / Data Steward
✦ New
Salary not disclosed
Montgomery 1 day ago
Job Requisition: Data Quality Analyst / Data Steward

Contract Length: Long Term – potential renewal each fiscal year

Work Location: 100% onsite – Montgomery, AL

Candidate Profile

Experienced data professional capable of building, advancing, and scaling data quality and governance foundations from scratch.

Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.

Role Purpose

Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment.

Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture.

Drive long-term improvement of data standards, definitions, lineage, and quality processes.

Key Responsibilities

Data Quality & Engineering

Perform data audits, profiling, validation, anomaly detection, and quality gap identification.

Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.

Build and maintain SSIS packages for validation, cleansing, transformation, and error detection workflows.

Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.

Conduct root-cause analysis and implement preventive and long-term remediation solutions.

Optimize SQL queries, tune stored procedures, and improve data processing performance.

Document audit findings, validation processes, data flows, standards, and quality reports.

Build dashboards and reports for data quality KPIs using Power BI/Tableau.
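The rules-to-KPI pipeline behind such a dashboard can be sketched compactly. The rules and field names below are assumptions for illustration; in this role the logic would live in T-SQL/SSIS with Power BI or Tableau on top:

```python
# Each rule maps a field to a predicate; a KPI is the fraction of rows
# passing that rule. Fields and rules are hypothetical examples.
RULES = {
    "member_id": lambda v: bool(v),                      # completeness
    "dob":       lambda v: len(str(v)) == 10,            # format: YYYY-MM-DD
    "status":    lambda v: v in {"active", "inactive"},  # domain check
}

def quality_kpis(rows):
    """Return per-field pass rates (0.0–1.0) suitable for charting."""
    totals = {field: 0 for field in RULES}
    for row in rows:
        for field, check in RULES.items():
            if check(row.get(field)):
                totals[field] += 1
    n = len(rows)
    return {field: round(passed / n, 2) for field, passed in totals.items()}

sample = [
    {"member_id": "A1", "dob": "1990-01-02", "status": "active"},
    {"member_id": "",   "dob": "1990-1-2",   "status": "unknown"},
]
print(quality_kpis(sample))  # → {'member_id': 0.5, 'dob': 0.5, 'status': 0.5}
```

Tracking these rates over time is what turns one-off audits into the measurable, automated quality process the posting describes.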

Data Stewardship & Governance

Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.

Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.

Ensure proper and consistent data usage across departments and systems.

Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.

Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.

Provide training on data entry, data handling, stewardship practices, and data literacy.

Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.

Greenfield / Low-Maturity Environment

Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.

Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.

Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.

Drive adoption of data quality and governance practices across business and technical teams.

Support long-term evolution of enterprise data strategy and governance maturity.

Required Technical Skills

Advanced T-SQL, SQL Server development, debugging, and performance tuning.

SSIS development, deployment, and troubleshooting.

Data profiling, validation rule design, quality scoring, and measurement techniques.

ETL/ELT pipeline design, debugging, and optimization.

Data modeling (conceptual, logical, physical).

Metadata management and lineage documentation.

Reporting and dashboarding with Power BI, Tableau, or similar tools.

Strong documentation and communication skills.

Preferred Skills

Knowledge of DAMA DMBoK, DCAM, MDM concepts, and governance frameworks.

Experience in low-maturity/greenfield data environments.

Familiarity with AI/ML data readiness and feature store aligned data structuring.

Cloud data engineering exposure (Azure, Databricks, GCP).

Education

Bachelor’s degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or a related field.

Master’s degree preferred.

Certifications (Preferred)

DAMA CDMP (Associate/Practitioner)

EDM Council DCAM

ASQ Data Quality Credential

Collibra Data Steward Certification

Certified Data Steward (eLearningCurve)

Cloud/AI certifications (Azure, Databricks, Google)