Prometheus SQL Examples Jobs in USA
Loloi Rugs is a leading textile brand that designs and crafts rugs, pillows, and throws for the thoughtfully layered home. Family-owned and led since 2004, Loloi is growing more quickly than ever. To date, we’ve expanded our diverse team to hundreds of employees, invested in multiple distribution facilities, introduced thousands of products, and earned the respect and business of retailers and designers worldwide. A testament to our products and our team, Loloi has earned the ARTS Award for “Best Rug Manufacturer” in 2010, 2011, 2015, 2016, 2018, 2023, and 2025.
Security Advisory: Beware of Frauds
Protect yourself from potential fraud and verify the authenticity of any job offer you receive from Loloi. Rest assured that we never request payment or demand any sensitive personal information, such as bank details or social security numbers, at any stage of the recruiting process. To ensure genuine communication, our recruiters will solely reach out to applicants using an @ email address. Your security is of paramount importance to us at Loloi, and we are committed to maintaining a safe and trustworthy hiring experience for all candidates.
We are building a Business Operations Center of Excellence, and we need a Product Data Analyst to serve as the "Guardian of the Golden Record." In this role, you are the absolute owner of product data integrity as it relates to the digital customer experience. You ensure that every item we sell is accurately represented across every touchpoint—from our ERP and PIM to our website storefront and marketing feeds. This is not a data entry role; it is a high-impact technical logic and investigation role. You will work directly with our Data Platform and Software Engineering teams to define business rules, audit data health via complex SQL, and troubleshoot data transmission errors before they impact the customer.
Responsibilities
- Storefront Governance: Serve as the absolute owner of product data integrity within the PIM. Ensure that all storefront-critical attributes (pricing, dimensions, weights, image links) are accurate and standardized for a seamless customer experience.
- Technical Data Auditing: Write and run complex SQL queries against our centralized database to identify anomalies, "orphan" records, and data hygiene issues that need resolution. You will be expected to query across multiple schemas to validate data consistency between systems.
- Feed Logic & Mapping: You will manage the logic of how data translates from our PIM to external endpoints. You will ensure that our products appear correctly on Google Shopping, Meta, Amazon, and other marketplaces by managing feed rules and mapping definitions.
- API Payload Analysis: You will act as the first line of defense for data transmission errors. If a product isn't showing up on the site, you will review the JSON/XML response bodies to determine if it is a data payload error or a software code bug.
- Cross-Functional Impact Analysis: You will act as the gatekeeper for data changes, predicting downstream impacts (e.g., "If Merchandising changes this Category Name, it will break the Finance reporting filter").
- Hygiene Logic Definition: You will partner with our IT/Database team to define automated health checks. You identify the "rot" (bad data patterns), and they implement the database constraints to stop it.
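The "orphan record" auditing described above is essentially an anti-join. As a minimal sketch (using SQLite in place of the production database, with hypothetical `products` and `product_images` tables), it might look like this:

```python
import sqlite3

# In-memory SQLite stand-in for the product database; the table and column
# names here are hypothetical, not Loloi's actual schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE products (sku TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE product_images (id INTEGER PRIMARY KEY, sku TEXT, url TEXT);
    INSERT INTO products VALUES ('RUG-001', 'Layla Rug'), ('RUG-002', 'Skye Rug');
    -- The second image row references a SKU that no product record defines.
    INSERT INTO product_images VALUES
        (1, 'RUG-001', 'https://example.com/rug-001.jpg'),
        (2, 'RUG-999', 'https://example.com/rug-999.jpg');
""")

# LEFT JOIN + IS NULL anti-join: image rows whose SKU has no parent product.
orphans = conn.execute("""
    SELECT i.id, i.sku
    FROM product_images i
    LEFT JOIN products p ON p.sku = i.sku
    WHERE p.sku IS NULL
""").fetchall()
print(orphans)  # -> [(2, 'RUG-999')]
```

Once a pattern like this is confirmed, the same query can be handed to IT as the specification for an automated health check or foreign-key constraint.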
What You Will NOT Do (The Boundaries)
- No Web Development: You are not a Front-End Developer. You do not write HTML, CSS, or React code. You ensure the data powering those components is 100% accurate.
- No Manual Data Entry: Your job is not to copy-paste descriptions. You build the systems, bulk processes, and logic that ensure data quality at scale.
- No Database Administration: You do not manage server uptime or schema changes (IT owns this). You own the quality of the records inside the database.
Intersection with Technical Teams
- With IT (Database Mgmt): IT owns the infrastructure and schema; you own the quality of the data within it. When you identify a systemic issue (e.g., "5,000 orphan records"), you partner with IT to implement the technical fix (scripts/constraints).
- With Software Engineering (Commerce): If a product is missing from the site, you check the data payload. If the data is correct, you hand off to Engineering, confirming it is a code/caching bug rather than a data error.
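The data-versus-code triage above often starts with a simple payload check: if required keys are missing or null, it is a data problem; if the payload is complete and well-formed, it is handed off as a code/caching bug. A rough sketch, with hypothetical field names:

```python
import json

# Hypothetical API response body for a product sync; in practice this would
# come from the storefront or feed endpoint.
raw = '{"sku": "RUG-001", "price": 129.99, "weight_lbs": null}'

# Keys the storefront requires; this list is illustrative, not a real contract.
REQUIRED = ["sku", "price", "weight_lbs", "image_url"]

payload = json.loads(raw)
# Missing keys or JSON nulls point to a data payload error (our side);
# a complete payload points to a code or caching bug (Engineering's side).
issues = [k for k in REQUIRED if payload.get(k) is None]
print(issues)  # -> ['weight_lbs', 'image_url']
```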
Experience, Skills, & Ability Requirements
- 5-8 years of experience in Data Management, PIM Administration, or technical eCommerce Operations.
- SQL Proficiency: You are comfortable writing queries well beyond a simple SELECT *. You should be proficient with CTEs (Common Table Expressions), window functions (e.g., RANK, LEAD/LAG), subqueries, and complex joins, so you can act as a forensic data investigator.
- API Fluency: You can read and understand JSON and XML. You know what a valid payload looks like and can spot formatting errors or missing keys.
- Data Manipulation: You are an expert at handling large datasets (CSVs, Excel) and understand data types, formatting standards, and normalization concepts.
- You love hunting down the root cause of an error. You don't just fix the wrong price; you find out why the price was wrong and build a rule to stop it from happening again.
- You have high standards for accuracy. You understand that a wrong weight in the system means a financial loss on shipping for the business.
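As an illustration of the CTE and window-function level described above, the following sketch flags suspicious price drops with LAG() over a hypothetical price-history table (SQLite 3.25+ supports window functions):

```python
import sqlite3

# Hypothetical price-history table; SQLite 3.25+ is required for LAG().
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE price_history (sku TEXT, changed_on TEXT, price REAL);
    INSERT INTO price_history VALUES
        ('RUG-001', '2024-01-01', 129.99),
        ('RUG-001', '2024-02-01', 12.99),
        ('RUG-002', '2024-01-01', 89.99),
        ('RUG-002', '2024-02-01', 94.99);
""")

# CTE + LAG(): flag any price that fell by more than 50% versus the prior
# entry for the same SKU -- a classic "why was the price wrong?" audit.
rows = conn.execute("""
    WITH changes AS (
        SELECT sku, changed_on, price,
               LAG(price) OVER (PARTITION BY sku ORDER BY changed_on) AS prev_price
        FROM price_history
    )
    SELECT sku, changed_on, price, prev_price
    FROM changes
    WHERE prev_price IS NOT NULL AND price < prev_price * 0.5
""").fetchall()
print(rows)  # -> [('RUG-001', '2024-02-01', 12.99, 129.99)]
```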
Bonus Points (Nice-to-Haves)
- Familiarity with Visio/Lucidchart to visualize data flows.
- Ability to build simple dashboards in Tableau to track data health scores.
- Basic familiarity with Python or R for data manipulation.
What We Offer
- Health, dental, and vision benefits
- Paid parental leave
- 401(k) with employer match
- A culture of meritocracy that fosters ongoing growth opportunities
- A stable, growing family-owned company that looks after its employees
Loloi Rugs does not discriminate on the basis of race, sex, color, religion, age, national origin, marital status, disability, veteran status, genetic information, sexual orientation, gender identity or any other reason prohibited by law in provision of employment opportunities and benefits. We seek a diverse pool of applicants and consider all qualified candidates regardless of race, ancestry, color, gender identity or expression, sexual orientation, religion, national origin, citizenship, disability, Veteran status, marital status, or any other protected status. If you have a special need or disability that requires accommodation, please let us know.
Job Title: Statistical Programming Analyst
Location: Columbia, SC
Contract duration: 6 Month contract with potential for extension or conversion
Job Summary
We are seeking a Statistical Programming Analyst III to join our Research & Analysis team in a partially onsite role, responsible for developing and delivering data-driven reports from large healthcare datasets. This role focuses on ensuring data integrity, executing routine monthly reporting processes, and creating ad hoc analyses to meet customer needs through statistical programming and data interpretation in tools such as SAS, SQL, and/or Python, with a planned transition toward Python and expanded data visualization capabilities.
Key Job Responsibilities:
- Ensure data integrity by validating and supporting monthly data warehouse table loads
- Execute standard and routine reporting processes using updated datasets
- Develop and deliver recurring and ad hoc reports based on customer requirements
- Use statistical programming (SAS, SQL, and/or Python) to extract, manipulate, and analyze large healthcare datasets
- Perform mathematical computations and data interpretation to generate meaningful insights
- Collaborate with internal teams, external vendors, and CMS stakeholders to understand reporting needs
- Translate technical findings into clear, concise reports and documentation for both technical and non-technical audiences
- Create and maintain detailed documentation for reporting processes and outputs
- Support the transition of existing processes to Python and contribute to enhancements in data visualization, dashboarding, and modeling
- Adapt to evolving tools, systems, and requirements within a dynamic contract environment
Job Qualifications:
- Bachelor’s or Master’s degree in Statistics, Biostatistics, Mathematics, Computer Science, or a related field
- Minimum of 4 years of experience in statistical programming or statistical data interpretation
- Strong experience with statistical programming and reporting (SAS, SQL, and/or Python preferred)
- Proficiency with Microsoft Office applications
- Experience working with relational databases and large datasets
- Ability to perform mathematical computations and analyze complex data
- Strong written and verbal communication skills, including the ability to explain technical concepts to non-technical audiences
- Experience creating clear, detailed documentation for reports and processes
- Ability to work with multiple stakeholders, including external vendors and CMS
- Adaptability to changing tools, technologies, and processes (including transition to Python and new visualization tools)
- Self-motivated, able to work independently, and comfortable solving complex problems with limited direction
- Must meet CMS security clearance and U.S. residency requirements (3 of the last 5 years in the U.S.)
Screening questions:
Do you now, or will you in the future, require sponsorship (e.g., H-1B)? Y/N
EEO and ADA Statement:
Consulting Solutions and its family of companies is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, disability status, genetic information, protected veteran status, or any other characteristic protected by law.
If you are a person with a disability needing assistance with the application or at any point in the hiring process, please contact us at:
Title: Senior Data Analyst
Duration: Long term
Location: Dallas, TX
Job Description:
Primary responsibilities of the Senior Data Analyst include supporting and analyzing data anomalies for multiple environments including but not limited to Data Warehouse, ODS, Data Replication/ETL Data Management initiatives. The candidate will be in a supporting role and will work closely with Business, DBA, ETL and Data Management team providing analysis and support for complex Data related initiatives. This individual will also be responsible for assisting in initial setup and on-going documentation/configuration related to Data Governance and Master Data Management solutions. This candidate must have a passion for data, along with good SQL, analytical and communication skills.
Responsibilities
- Investigate and Analyze data anomalies and data issues reported by Business
- Work with ETL, Replication and DBA teams to determine data transformations, data movement and derivations and document accordingly
- Work with support teams to ensure consistent and proactive support methodologies are adhered to for all aspects of data movement and data transformation
- Assist in break fix and production validation as it relates to data derivations, replication and structures
- Assist in configuration and on-going setup of Data Virtualization and Master Data Management tools
- Assist in keeping documentation up to date as it relates to Data Standardization definitions, Data Dictionary and Data Lineage
- Gather information from various Sources and interpret Patterns and Trends
- Ability to work in a team-oriented, fast-paced agile environment managing multiple priorities
Qualifications
- 4+ years of SQL experience working in OLTP, Data Warehouse and Big Data databases
- 4+ years of experience working with Exadata and SQL Server databases
- 4+ years in a Data Analyst role
- Strong attention to Detail
- 2+ years writing medium to complex stored procedures a plus
- Ability to collaborate effectively and work as part of a team
- Extensive background in writing complex queries
- Extensive working knowledge of all aspects of Data Movement and Processing, including ETL, API, OLAP and best practices for data tracking
- Good Communication skills
- Self-Motivated
- Works well in a team environment
- Denodo Experience a plus
- Master Data Management a plus
- Big Data Experience a plus (Hadoop, MongoDB)
- Postgres and Cloud Experience a plus
Healthcare Business Intelligence & Analytics Analyst - Information Technology
Location:
620 Foster Avenue Brooklyn, NY 11230
Hours:
Full Time
Premium Health Center, a rapidly growing FQHC in Brooklyn, is seeking a detail-oriented and analytical Business Intelligence (BI) Analyst to join our growing Data & Analytics team. This role blends data analysis with light data engineering to build robust data pipelines, deliver actionable insights, and create high-quality reporting and analytics. The BI Analyst will play a key role in transforming raw data into actionable insights that will directly inform strategic, clinical, operational, and financial decisions across the organization.
Time Commitment:
Full Time, Hybrid Eligible
Responsibilities:
Analytics, Visualization & Storytelling
· Design, develop, and maintain dashboards, reports, and data visualizations in Power BI (or similar tools)
· Apply data visualization and storytelling best practices to create intuitive, user-friendly dashboards.
· Translate complex healthcare data into clear, actionable insights that support decisions for clinical, operational, finance, and executive teams.
· Develop and maintain semantic data models, KPIs, and performance metrics aligned with FQHC goals.
· Collaborate with stakeholders to gather requirements and recommend effective analytical and visual solutions.
· Analyze healthcare data from EHR systems (e.g., eClinicalWorks, Office Practicum) and other sources to identify trends, gaps, and opportunities for improvement.
· Support UDS (Uniform Data System) reporting and other regulatory compliance requirements.
· Create sustainable reporting frameworks for recurring healthcare and operational metrics.
Data Engineering & Pipeline Support
· Build and maintain light ETL and data integration tasks using SQL, APIs, and scripting tools.
· Write and optimize SQL queries to support analysis, dashboards, and data pipelines.
· Perform data wrangling, cleaning, validation, and transformation to prepare datasets for analysis and reporting.
· Ensure data integrity, accuracy, and security in all reporting and data engineering workflows.
· Perform data validation, reconciliation, and root-cause analysis for data quality issues.
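The validation and reconciliation work listed above usually comes down to comparing what left the source system against what landed in the reporting layer: row counts, control totals, and the specific records that went missing. A minimal sketch with hypothetical record layouts:

```python
# Hypothetical extracts: a source-system pull and the corresponding rows that
# landed in the reporting layer. Field names are illustrative only.
source = [
    {"visit_id": 1, "charge": 100.0},
    {"visit_id": 2, "charge": 250.0},
    {"visit_id": 3, "charge": 75.0},
]
warehouse = [
    {"visit_id": 1, "charge": 100.0},
    {"visit_id": 3, "charge": 75.0},
]

# Three standard checks: which IDs failed to land, whether row counts match,
# and the control-total difference that quantifies the gap.
missing_ids = {r["visit_id"] for r in source} - {r["visit_id"] for r in warehouse}
count_ok = len(source) == len(warehouse)
total_diff = sum(r["charge"] for r in source) - sum(r["charge"] for r in warehouse)
print(missing_ids, count_ok, total_diff)  # -> {2} False 250.0
```

In practice the same checks run in SQL or pandas against the warehouse tables; the point is that the missing-ID set turns "the totals don't match" into a concrete root-cause lead.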
Collaboration and Data Literacy
· Collaborate with clinical, operational, and executive teams to understand business needs and translate them into technical solutions.
· Provide training, documentation, and support to improve data literacy and promote appropriate self-service use of organizational dashboards.
· Collaborate with IT and data teams on architecture, governance, and data quality initiatives.
Requirements:
· Bachelor's degree in Data Science, Public Health, Health Informatics, Computer Science, or a related field.
· 4+ years of experience in a BI, data analyst, or similar role, preferably in a healthcare or FQHC setting.
· Strong proficiency in SQL, including complex joins, window functions, and data transformations
· Hands-on experience with Power BI, or similar BI platform, including DAX, data modeling, and visualization design.
· Experience working with scripting languages (Python, R, etc.) and APIs to support data integration and automation.
· Experience with semantic data modeling in Power BI.
· Strong analytical, critical thinking, and problem-solving skills.
· Excellent communication and data storytelling skills with the proven ability to present insights to non-technical audiences.
· Detail oriented with strong data troubleshooting and validation skills.
· Highly organized, with the ability to manage multiple tasks and deadlines.
· Self-starter who works independently and collaboratively.
· Ability to partner cross-functionally across clinical, operational, financial, IT, and data teams.
· Fast learner with adaptability to evolving tools and organizational needs.
· Strong commitment to high standards of data quality, accuracy, and confidentiality.
· Familiarity with HIPAA or other similar data privacy standards.
Preferred:
· Experience with Microsoft Azure, Fabric, Purview, or similar cloud platforms.
· Experience with Power Automate or similar tool for basic workflow automation.
· Familiarity with Git or similar version control tools.
· Experience with EHR systems (eCW, Office Practicum, etc.).
· Understanding of healthcare data, including clinical, operational, and financial metrics.
· Experience with UDS reporting or other healthcare regulatory or quality metrics.
Compensation:
$110,000 - $145,000, commensurate with experience
Benefits:
· Medical, Dental, Vision and Life coverage
· Paid Time Off and holidays
· Employee Assistance Program
· Flexible spending account
· Public Service Loan Forgiveness (PSLF), NHSC Loan Repayment Program
· 403(b) Retirement Plans with employer matching
Data Analyst - Transportation
Overview:
This position is for a Data Analyst who can manage Supply Chain's SQL Server data, BigQuery data warehouse, Alteryx workflows (ETL), and Tableau dashboard development. The candidate should be an excellent Alteryx and Tableau modeler who can build from requirements documentation with limited support.
Responsibilities:
- Performs advanced business analysis using various techniques, e.g. statistical analysis, explanatory and predictive modeling, data mining
- Create informative reports, dashboards, visualizations, and presentations to communicate data findings to stakeholders, translating complex information into actionable insights and opportunities for improvement
- Works closely with the internal or external client to understand available data elements, where they reside across various systems.
- Manages, architects, and automates the flow of information from various source systems through the supply chain SQL Server and ultimately into the Supply Chain Data Warehouse, ensuring clean, harmonized, and accurate data.
- Ability to work and communicate with business stakeholders and technical teams while bridging the communication gap between the two.
- Ensure a high degree of accuracy and timeliness in generating reporting outcomes for business leaders, including near-real-time operational reporting.
- Conduct insightful ad hoc analyses to investigate ongoing or one-time operational issues, including key Transportation initiatives.
- This role will also be critical in managing and integrating financial, operational, and transactional data to support our Transportation and Distribution Center (DC) network—comprising 9 DCs and handling approximately 500+ million annual case transactions.
- Support the management of the OpEx & CapEx supply chain budget, including cost optimization and ad hoc financial tracking.
- Other duties as assigned.
Qualifications:
- Technical expertise in SQL, Alteryx, Tableau, Power BI, and Python.
- Strong knowledge of and experience working with data stores, data extraction & cleansing, analyzing large volumes of data using analytical tools, and in building data models.
- Knowledge of statistics and experience analyzing large datasets.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Advanced working knowledge of Microsoft PowerPoint, MS Excel
- Technical expertise in IBM (Financial), OTM (Transportation Management System), and Catalyst (Warehouse Management System)
Education & Experience:
- Bachelor's Degree - Data Science, Engineering or Mathematics
- Master’s degree preferred.
- 3-5 years of experience working with the data required (preferably in a data warehouse, data management team, or repository)
- Project management, business, practice, and people development experience
- Expert knowledge of business intelligence reporting solutions (Tableau, MicroStrategy, Business Objects, etc.)
Manage and optimize OpenShift deployments to support Artificial Intelligence (AI) and data-related solutions on Cloud Pak for Data.
Implement and maintain internal Watson OpenScale to monitor and interpret AI model performance in support of customers' AI and machine learning objectives.
Leverage internal Cloud Pak along with Studio and components to manage data, perform analytics, and enhance AI capabilities.
Configure and use additional cartridges such as DataStage or Db2 to extend Cloud Pak for Data functionalities.
Develop and manage containerized applications and services with OpenShift on Cloud Paks to improve deployment efficiency, scalability, and application robustness.
Advise customers on applying AI Operations practices to ensure reliable and efficient AI system operations.
Optimize generative AI models and algorithms for better performance, accuracy, and confidence or ROUGE score optimization.
Design, develop, and implement AI solutions tailored to customer needs.
Engage with client executives to understand their requirements and provide suitable AI and data solutions and strategies.
Create and present tailored solutions that address client needs using the mentioned technologies.
Continuously monitor AI model performance and make necessary adjustments while ensuring compliance with security standards, particularly in the financial services sector.
Utilize: Python, Machine Learning, Pandas, NumPy, Scikit-learn, SQL.
Required: Bachelor's degree or equivalent in Computer Science, Data Science, Engineering, Information Systems, Mathematics, or a related field (employer will accept an Associate's degree plus two (2) years of IT experience in lieu of a Bachelor's degree) and two (2) years of experience as an Analyst, Technical Specialist, or related.
Two (2) years of experience must include utilizing Python, Machine Learning, Pandas, NumPy, Scikit-learn, SQL.
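As a rough illustration of the ROUGE score optimization mentioned in the responsibilities above, ROUGE-1 recall counts unigram overlap between a generated text and a reference. This is a deliberately simplified sketch (whitespace tokenization, no stemming); production work would use an established implementation:

```python
from collections import Counter

def rouge1_recall(candidate: str, reference: str) -> float:
    """ROUGE-1 recall: fraction of reference unigrams matched by the
    candidate, with clipped counts. Simplified: lowercase whitespace
    tokenization, no stemming or stopword handling."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clip each token's match count at the candidate's count for that token.
    overlap = sum(min(cnt, cand[tok]) for tok, cnt in ref.items())
    return overlap / max(sum(ref.values()), 1)

# Reference has 5 tokens; "the", "summary", and "short" are matched -> 3/5.
score = rouge1_recall("the model generated a short summary",
                      "the reference summary is short")
print(round(score, 2))  # -> 0.6
```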
$199,998 - $225,000 per year.
Please send resumes to
Applicants must reference H270 in the subject line.
Keywords: Client Services Manager, Location: Armonk, NY 10504
Fisher Investments Europe's Global Marketing Group is the lead generation engine for the European Private Client Group, and the Marketing Data & Analytics Team plays a critical role in that process. In this position, you will help guide key decisions while assessing performance.
As a Marketing Data & Analytics Analyst, you will help maintain the flow of data through the department, provide timely analysis and reporting to all levels of Marketing stakeholders, and assist in ad hoc requests and long-term developments. Utilizing skills in SQL, Excel, VBA, PowerBI, and more, you will build reporting and hone your project management skills to support infrastructure improvements. You will also learn how to critically assess problems and opportunities to improve bottom-line results.
Performance will be judged on the ability to solve problems, communicate, and assist internal clients and all other teams in Global PCG Marketing.
The Day-to-Day:
* Work with Marketing management to support all phases of Marketing efforts
* Build and maintain daily reporting for global Marketing teams
* Build and automate new and existing processes
* Understand the "why" and "how" of department data flow
* Communicate clearly by distilling information down for a non-technical audience
* Provide data-driven analysis and insights to optimize campaign performance
* Collaborate with other analysts, Marketing managers, and database developers on both strategic initiatives and ongoing data infrastructure enhancement projects
* Manage ad-hoc data requests to help meet a variety of needs such as troubleshooting data oddities, QA'ing, modeling, and more
Your Qualifications:
* 2+ years Marketing Data Analytics experience
* Strong proficiency in SQL, Excel, PowerBI, and relational databases
* Attention to detail and a history of managing complexity
* Demonstrated leadership and self-direction; capacity for learning new skills and a willingness to share knowledge and teach others
* Ability to communicate both technical and non-technical insights to all levels of management
* Strong project management skills
Why Fisher Investments Europe:
The global Fisher organisation distinguishes itself by putting clients first, providing unmatched service, and taking a personalised approach to investing. You can feel confident knowing that we align with our clients' best interests by using a simple and transparent fee structure and recognised European custodians.
It's the people that make the Fisher purpose possible, and to help our employees meet their long-term goals, we offer an array of benefits, including:
* 100% paid premiums for our top-tier supplemental medical, dental and annual health screening plans for employees and their qualified dependents
* 28 days annual leave, with the ability to purchase up to 3 additional days per year, plus up to 8 paid holidays
* Enhanced maternity pay package with 16 weeks' top up to full base pay for eligible employees
* $10,000* fertility, hormonal health and family-forming benefit
* A retirement pension plan, featuring a 9% company contribution of base pay with an additional company match of up to 5% of base pay on personal contributions
* Gym subsidy of up to £50 per month
* Employee Assistance Program and other emotional wellbeing services
* A collaborative working environment that practises ongoing training, educational support and employee appreciation events
* This is an in-office role. Based on your role, tenure, and performance eligibility you may have the opportunity to participate in our hybrid work from home program. This program is subject to change.
*Employees residing outside of the US will be eligible for the $10,000 equivalent in their local currency.
FISHER INVESTMENTS EUROPE IS AN EQUAL OPPORTUNITY EMPLOYER
Analyze, design, configure, maintain, test, troubleshoot, and implement changes to custom and package applications.
Learn and understand business processes associated with supported applications.
Develop technical designs that meet business needs and support the company's IT direction.
Properly leverage the correct technology for defined requirements.
Create and execute comprehensive unit test plans, develop test cases, assist with integration, and complete system test plans.
Develop clear documentation for completed products.
Investigate and resolve problems with supported applications in a production support environment.
Assist with coordination of software packaging requests to support Asset Management.
Solve complex problems and troubleshoot functionality for issue identification.
Learn electric utility transmission and distribution related business processes.
Become proficient in software development on supported applications and related tools.
Consistently deliver high-quality results.
Manage source code repositories, builds, and deploys.
Actively participate in a collaborative work environment.
Availability to work flexible hours to support our application portfolio.
Interact frequently with business owners/stakeholders to gather requirements for new development projects and assist in defect resolution.
Requirements: Experience with relational databases and strong SQL skills, including writing procedures, triggers, and jobs.
Experience with Microsoft Server configuration/administration (IIS, Services, Tasks).
Experience with Asset Management products such as DNV Cascade, Doble Powerbase, Hitachi Asset Performance Monitor (APM), and SEL Compass is a strong plus.
Knowledge of Linux/UNIX operating systems is a plus.
Experience with Kubernetes clusters and application deployments is a plus.
Integration experience is a strong plus.
Analytical skills to resolve problems and think creatively.
Willingness to undertake assignments involving unfamiliar subjects, with the aptitude to learn quickly.
Ability to learn new concepts in information technology and update skills to adapt to changing technology.
Good interpersonal skills and ability to work effectively as part of a team.
Highly motivated to work independently and productively in a virtual environment.
Ability to meet established priorities and schedules, and handle multiple tasks.
Demonstrated ability to provide software solutions and support the entire software development life cycle.
Proven ability to troubleshoot and solve problems in a production support environment.
Demonstrated ability to effectively prioritize and plan work, handle multiple concurrent tasks, and meet deadlines.
Required Skills: Relational databases and strong SQL skills.
Microsoft Server configuration/administration.
Analytical and problem-solving skills.
Interpersonal and team collaboration skills.
Ability to work independently and in a virtual environment.
Preferred Skills: Experience with Asset Management products.
Knowledge of Linux/UNIX operating systems.
Experience with Kubernetes clusters and application deployments.
Integration experience.
Onsite | Duration: 24 Months | Pay: $38.57 per hour
Position Summary: This role supports the client's Vegetation and Program Management teams by developing and maintaining financial and operational reports, dashboards, and analytics.
The position ensures accurate budget tracking, performance monitoring, and executive reporting to support reliability, compliance, and strategic decision-making.
Key Responsibilities
• Develop and maintain financial and program performance reports and dashboards.
• Track budgets, forecasts, and expenditures for vegetation and infrastructure programs.
• Perform variance and trend analysis.
• Extract, validate, and reconcile data from enterprise systems.
• Utilize SAP, Power BI, Databricks, SQL, and Python to support reporting and analytics.
• Support management, regulatory, and audit reporting.
• Automate and improve reporting processes.
• Provide ad hoc analysis for leadership and business partners.
Required Qualifications
• Bachelor's degree in Finance, Business Analytics, Information Systems, or a related field.
• 3+ years of experience in financial analysis or reporting.
• Advanced Excel skills.
• Experience using SAP, Power BI, Databricks, SQL, and Python.
• Strong analytical, organizational, and communication skills
Preferred Qualifications
• Utility or regulated industry experience.
• Experience supporting large operational or capital programs.
Work Environment
• Full-time, in-office position.
• Fast-paced, deadline-driven environment with high visibility.
• Work cross-functionally across departments to drive success
• Shared responsibility for outcomes.
Responsible for design, development, and implementation of ESS portions of OBBA Tax Bill (IRS TY2025 Tax Law components) and Reporting Project.
Collaborate closely with the project manager, end users, application development, and product owner team members.
Accomplish design, coding, testing, and implementation of new applications to satisfy business needs.
Requirements: Minimum of seven (7) years of experience with multiple versions of PeopleSoft and People Tools across HCM modules.
Required Skills: Extensive experience and understanding of PeopleSoft HCM applications (v 9.2 PUM 39 with fluid technology) and PeopleSoft Tools (v 8.61).
Implementation, design, and customization experience for PeopleSoft Tools Upgrades.
Experience in data/code analysis, creation and review of technical designs, and documentation from functional design, coding, unit testing, and debugging various PeopleSoft HCM modules.
Experience in customizing COBOL stored procedures.
Programming experience using Application Designer, PeopleCode, Application Package, Application Engine, Workflow, Integration Broker, and SQR.
Experience in creating/modifying data conversion scripts.
Experience developing complex interfaces into legacy and 3rd party systems.
Proficient with SQL and PS/Query for reviewing, troubleshooting, and testing/validating source system data.
Strong analytical and problem-solving skills.
Strong knowledge of STAT Tool, Oracle, and SQL Server.
Excellent communication skills (oral and written), interpersonal, and organizational skills.
Preferred Skills: Knowledge and extensive experience with Oracle Fluid technology.
Extensive experience using Web Services.
Experience with enterprise-wide, large-scale implementations.
Experience with CA Scheduler Tool.
Education: Bachelor's or Master's Degree in Computer Science, Engineering, or in a Technical/Business Discipline is required.