SQL Jobs with Query Examples – Remote Jobs in USA
About the Job:
The Applied Analytics Analyst serves as the technical anchor of the Strategic Resource Group. This role is responsible for translating complex business and market questions into structured, executable data outputs using Trilliant Health’s proprietary claims, provider directory, and price transparency datasets.
The Applied Analytics Analyst owns feasibility validation, analytical methodology design, and data integrity across research initiatives and pre-sales support. This individual combines strong technical proficiency with healthcare domain expertise and plays a critical role in standardizing how recurring strategic questions are answered across the organization.
You are our ideal candidate if you:
- Design and execute complex SQL queries and data builds from Trilliant’s data warehouse
- Capture and maintain documentation outlining how and why analytical frameworks are applied to support consistency and institutional knowledge retention
- Validate data integrity and identify gaps, missingness, structural limitations, or edge cases
- Own technical feasibility assessments for research and pre-sales opportunities
- Develop repeatable analytical frameworks for common strategic use cases
- Support research initiatives through structured dataset construction and methodological validation
- Create reusable datasets, templates, and documentation to reduce institutional knowledge concentration
- Maintain high standards of quality control and analytical rigor across all deliverables
- Interface effectively with Sales, SRG, Research, Product, and Data Engineering teams
- Respond to ambiguity with structured problem solving and professional judgment
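As an illustration of the kind of repeatable SQL build and data-integrity check this role describes, here is a minimal sketch against a hypothetical claims table. The schema, column names, and values are invented for illustration; Trilliant's actual warehouse is proprietary and not described in this posting.

```python
import sqlite3

# Hypothetical, simplified claims table -- stands in for the real
# proprietary warehouse, which this posting does not describe.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (
        claim_id       INTEGER PRIMARY KEY,
        provider_npi   TEXT,
        service_date   TEXT,
        allowed_amount REAL
    );
    INSERT INTO claims VALUES
        (1, '1234567890', '2024-01-15', 120.50),
        (2, '1234567890', '2024-02-03', NULL),
        (3, '9876543210', '2024-02-10', 310.00);
""")

# A reusable feasibility check: claim volume and missingness per
# provider -- the kind of data-integrity query the role owns.
rows = conn.execute("""
    SELECT provider_npi,
           COUNT(*) AS claim_count,
           SUM(CASE WHEN allowed_amount IS NULL THEN 1 ELSE 0 END)
               AS missing_amounts
    FROM claims
    GROUP BY provider_npi
    ORDER BY provider_npi
""").fetchall()

for npi, n, missing in rows:
    print(npi, n, missing)
```

Queries like this can be templated per provider, market, or service line, which is how "repeatable analytical frameworks for common strategic use cases" typically take shape.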
Technical Skills:
- Advanced proficiency in SQL and experience querying large data warehouses
- Experience working in Databricks or similar environments preferred
- Strong proficiency in Excel and PowerPoint
- Familiarity with Tableau or other BI tools
- Experience working with complex healthcare claims datasets required
Other Skills:
- Strong analytical and critical thinking skills
- Ability to synthesize large datasets into structured outputs
- Excellent documentation and organizational skills
- Strong written and verbal communication skills
- Ability to work independently with minimal supervision
- High attention to detail and commitment to data quality
Position Location:
This position is onsite in Brentwood, TN
*We are unable to provide visa sponsorships for this role.
About Trilliant Health:
Trilliant Health is a high-growth, healthcare technology company. We are on a mission to be the most trusted advisor, dependable partner and provider of analytic insights to key stakeholders in the health economy enabling them to maximize return on invested capital. We do that by providing education and expertise through thought leadership, evidence-based strategy, and predictive analytics. We are looking to grow our team as we strive to influence positive change in healthcare by disrupting the status quo and promoting improved decision-making.
Randstad is presently conducting a search for a Sr. Financial Reporting Analyst for a well-established, progressive, and rapidly growing healthcare organization. The organization offers competitive benefits, opportunities for professional development, a collaborative working environment, and top-notch leadership. The role will report to the VP, Finance.
The BI Analyst is responsible for delivering accurate, timely, and standardized workforce and productivity reporting to finance, operations, and executive leadership. This role serves as the primary owner of SQL-based data queries and report production, translating workforce data into clear Excel outputs and dashboards that support operational and financial decision making. This position focuses on descriptive and diagnostic reporting, not predictive modeling or data science.
This role provides foundational workforce reporting infrastructure, improves turnaround time for leadership insights, and ensures advanced analytics resources are focused on higher-value strategic work rather than routine reporting.
Required Qualifications
- Bachelor’s degree in Analytics, Finance, Information Systems, Statistics, or related field
- 5+ years of relevant experience
- Demonstrated experience writing SQL queries against relational databases.
- Advanced proficiency in Microsoft Excel (pivot tables, formulas, data validation).
- Experience producing operational or workforce reports for business leaders.
- Experience with Power BI, Tableau, or SSRS.
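A minimal sketch of the descriptive workforce reporting this role centers on, using an invented employees table (the real HRIS schema is not described in the posting):

```python
import sqlite3

# Illustrative only: a toy workforce table standing in for the
# organization's real HRIS data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (
        emp_id     INTEGER PRIMARY KEY,
        department TEXT,
        hire_date  TEXT,
        term_date  TEXT    -- NULL means still active
    );
    INSERT INTO employees VALUES
        (1, 'Nursing', '2022-01-10', NULL),
        (2, 'Nursing', '2023-03-01', '2024-06-30'),
        (3, 'Finance', '2021-05-20', NULL);
""")

# Descriptive (not predictive) reporting: active headcount and
# terminations per department, ready to land in an Excel output.
report = conn.execute("""
    SELECT department,
           SUM(CASE WHEN term_date IS NULL THEN 1 ELSE 0 END)
               AS active,
           SUM(CASE WHEN term_date IS NOT NULL THEN 1 ELSE 0 END)
               AS terminated
    FROM employees
    GROUP BY department
    ORDER BY department
""").fetchall()
print(report)
```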
For immediate and confidential consideration, please email resume to Chip Doshi,
Title: Specialist I, Logistics Data
Job Summary: The Logistics Data Specialist is responsible for managing logistics master data, validating transactional accuracy, and delivering analytics that support transportation planning, customs execution, warehousing, and freight settlement. This role partners with Operations, Procurement, Trade Compliance, and Finance to ensure information reliability and actionable reporting.
Responsibilities include:
- Maintain carriers, lanes, rates, BOMs, HTS, and partner master data in TMS/WMS/SAP.
- Perform audits on shipments covering tracking milestones, POD, cost allocation, and accrual triggers.
- Identify root causes of data discrepancies and implement corrective actions.
- Build SOPs for data entry, validation logic, and exception handling
- Develop dashboards for OTIF, GIT, transit time, freight spend, accessorial, claims, and capacity utilization.
- Provide weekly/monthly KPI packs to operations leadership.
- Support budget vs. actual analysis and PR forecast modeling.
- Translate business requirements into SQL/BI outputs.
- Validate rating, fuel, and accessorial charges.
- Support three-way match among PO, shipment, and invoice.
- Prepare accrual and variance reports.
- Assist audit requests from Finance
- Act as super-user for TMS/WMS modules.
- Drive automation to reduce manual work
- Work with transportation, warehouse, procurement, and customs teams to improve data transparency.
- Provide data analysis for RFPs, network optimization, and vendor reviews
- All other duties as assigned
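The three-way match responsibility above can be sketched as a single exception query. The table layouts and amounts below are invented for illustration; real TMS/WMS/SAP structures differ considerably.

```python
import sqlite3

# Hypothetical PO, shipment, and invoice tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE po       (po_id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE shipment (po_id INTEGER, received REAL);
    CREATE TABLE invoice  (po_id INTEGER, billed REAL);
    INSERT INTO po       VALUES (100, 500.0), (101, 250.0);
    INSERT INTO shipment VALUES (100, 500.0), (101, 250.0);
    INSERT INTO invoice  VALUES (100, 500.0), (101, 275.0);  -- overbilled
""")

# Three-way match: flag POs where ordered, received, and billed
# amounts disagree beyond a small tolerance.
exceptions = conn.execute("""
    SELECT p.po_id, p.amount, s.received, i.billed
    FROM po p
    JOIN shipment s ON s.po_id = p.po_id
    JOIN invoice  i ON i.po_id = p.po_id
    WHERE ABS(p.amount - s.received) > 0.01
       OR ABS(s.received - i.billed) > 0.01
""").fetchall()
print(exceptions)   # only PO 101 fails the match
```

In practice the exception list would feed the accrual and variance reports mentioned above rather than being printed.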
Qualifications:
- Bachelor’s degree in Supply Chain, Logistics, Business Analytics, or related discipline
- 2+ years in logistics, transportation analytics, or supply chain systems.
- Experience working with freight invoices, carrier data, or brokerage information is highly valued
- Advanced Excel (pivot tables, power query, xlookups).
- SQL or similar database querying.
- BI tools such as Power BI, Tableau, or Looker.
- Familiarity with SAP/TMS/WMS environments (e.g., SAP, Oracle, MercuryGate, etc.).
- Strong analytical reasoning.
- High attention to detail.
- Comfortable in fast-moving, build-phase environments.
Physical Requirements and Working Conditions
- Ability to sit for extended periods while working at a computer
- Frequent use of hands and fingers for typing, filing, and operating office equipment
- Occasional standing, walking, bending, and reaching
- Ability to lift and carry light office materials (up to 10–15 lbs.), such as files or office supplies
- Visual acuity to read screens, documents, and reports
- Ability to attend meetings and interact with employees, clients, and vendors
The Operations Data Analyst will play a crucial role in applying data analysis and reporting to support operational performance, production scheduling, and process improvements. This position will work closely with operations teams to assist in scheduling tasks as needed, while also owning analytics processes, reporting, and data-driven insights that improve productivity, quality, and throughput.
***This is a 100% on-site, full-time position in Madison, IN. Hybrid or remote work is NOT available.
Qualifications
Technical Skills
- Strong SQL skills: ability to write complex queries, join large datasets, optimize performance, and produce reliable analytical outputs.
- Advanced Excel expertise: pivot tables, VLOOKUP/XLOOKUP, Power Query, macros/VBA a strong plus.
- Experience with data visualization tools (Power BI, Tableau, or similar) preferred.
- Comfortable working with large datasets and generating meaningful insights.
Production Experience
- Understanding of production planning, operations workflows, and scheduling concepts — ideally in a manufacturing/industrial environment.
- Prior work in supporting production operations with analytical tools or capacity planning.
Education
- Bachelor’s Degree preferred — Analytics, Industrial Engineering, Supply Chain, Data Science, Business Analytics, Mathematics, or related field.
- Equivalent experience with strong technical skills considered.
Preferred Qualifications
- Experience with manufacturing ERP/MRP systems.
- SQL Server, MySQL, PostgreSQL, or similar database experience.
- Familiarity with scheduling tools or production modules within ERP systems.
- Knowledge of Lean Manufacturing or Continuous Improvement methodologies.
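The "complex queries" and "join large datasets" skills called for above often come down to window functions over production data. A minimal sketch, with an invented production log (the actual ERP/MRP schema is not specified in the posting):

```python
import sqlite3

# Hypothetical production log for one line.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE production (day TEXT, line TEXT, units INTEGER);
    INSERT INTO production VALUES
        ('2024-05-01', 'A', 100),
        ('2024-05-02', 'A', 120),
        ('2024-05-03', 'A', 90);
""")

# Rolling-average throughput per line using a window function --
# a typical building block for operations dashboards.
trend = conn.execute("""
    SELECT day, units,
           AVG(units) OVER (
               PARTITION BY line ORDER BY day
               ROWS BETWEEN 1 PRECEDING AND CURRENT ROW
           ) AS rolling_avg
    FROM production
    ORDER BY day
""").fetchall()
print(trend)
```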
***FOR IMMEDIATE CONSIDERATION, PLEASE SEND A COPY OF YOUR UPDATED RESUME.
At Genpact, we don’t just keep up with technology—we set the pace. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges.
If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what’s possible, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at   and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Principal Consultant, Business Intelligence – II
Skills – Genpact LLC seeks Principal Consultant, Business Intelligence – II (multiple positions) in New York, NY to be responsible for the design, development, testing, optimization, and maintenance of the full range of Business Intelligence (BI) reporting solutions, including the analysis, design, development, and deployment of business intelligence data models, metadata files, dashboards, reports, and subject areas in tools such as OBIEE. Collaborate with the architecture and data analyst team to understand the business requirements, translate them into formal requirements and design documents, establish specific solutions, and lead the efforts including programming and testing. Tune reports, loaders, table design, indexes, and stored procedures to deliver required performance. Design and develop complex SQL queries and PL/SQL stored procedures to support large management reporting requirements using Oracle Databases and for data validations. Employ new software technologies consistent with support requirements. Develop and maintain ETL processes that meet project-specific business and technical requirements and those of the overall enterprise data warehouse data integration portfolio. Guide and mentor the organization with best practices in report and dashboard design and development. Assist in the ongoing development of technical best practices for data movement, data quality, data cleansing, and other ETL-related activities. Facilitate the development of a corporate data policy that outlines the appropriate data stewardship and data audit requirements, as well as the service-level agreements, to support the appropriate quality and use of information in business and IT processes. Employ Windows and Linux environments, Oracle SQL, PL/SQL, UNIX, shell scripting, Tableau, Informatica PowerCenter, and OBIA.
Education – Position requires a Bachelor's degree in an Engineering (all), Computer Science, Information Technology, Computer Information Systems, Business Administration, or related field and 5 years of progressively responsible post-Bachelor’s experience in the job offered or related occupation. Foreign degree equivalents are acceptable. Position headquartered in New York, NY with placement at project sites nationally within the United States with no additional travel required. $169,541- $178,018 per year.
Please send resume and cover letter to:
Indicate job code “GPCBIINY0226” when applying.
Why join Genpact?
- Lead AI-first transformation – Build and scale AI solutions that redefine industries
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best – Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI – Work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress
Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up.
Let’s build tomorrow together.
The approximate annual base compensation range for this position is $169,541 to $178,018. The actual offer, reflecting the total compensation package plus benefits, will be determined by a number of factors which include but are not limited to the applicant’s experience, knowledge, skills, and abilities; geographic location; and internal equity.
Los Angeles, California-based candidates are not eligible for this role.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
OZ – Databricks Architect/ Senior Data Engineer
Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.
We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!
What We're Looking For:
We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.
This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.
Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.
Position Overview:
The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.
This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.
Key Responsibilities:
- Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
- Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
- DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
- Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
- Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
- GenAI Applications Development: It is a big plus to have experience in GenAI application development
Requirements:
- 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
- Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
- Strong programming skills in Python and SQL; experience with PySpark required.
- Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
- Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
- Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
- Strong understanding of data architecture, data modeling, and performance optimization.
- Experience working with cross-functional teams to deliver enterprise data solutions.
- Tackles complex data challenges, ensuring data quality and reliable delivery.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience designing enterprise-scale data platforms and modern data architectures.
- Experience with data integration tools such as Azure Data Factory or similar platforms.
- Familiarity with cloud data warehouses such as Databricks, Snowflake, or Azure Fabric.
- Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
- Databricks, Azure, or cloud certifications are preferred.
- Strong problem-solving, communication, and technical leadership skills.
Technical Proficiency in:
- Databricks, Apache Spark, PySpark, Delta Lake
- Python, SQL, Scala (preferred)
- Cloud platforms: Azure (preferred), AWS, or GCP
- Azure Data Factory, Kafka, and modern data integration tools
- Data warehousing: Databricks, Snowflake, or Azure Fabric
- DevOps tools: Git, Azure DevOps, CI/CD pipelines
- Data architecture, ETL/ELT design, and performance optimization
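The Medallion Architecture referenced above (bronze/silver/gold refinement stages) can be sketched in miniature. Plain Python stands in here for brevity; a real Databricks pipeline would use PySpark and Delta tables, and the records below are invented for illustration.

```python
# Bronze: raw ingested records, kept as-is (including bad rows).
bronze = [
    {"order_id": "1", "amount": "19.99"},
    {"order_id": "2", "amount": "bad"},    # malformed value
    {"order_id": "1", "amount": "19.99"},  # duplicate
]

# Silver: cleaned and deduplicated.
silver, seen = [], set()
for row in bronze:
    try:
        amount = float(row["amount"])
    except ValueError:
        continue                 # drop unparseable rows
    key = row["order_id"]
    if key in seen:
        continue                 # drop duplicates
    seen.add(key)
    silver.append({"order_id": key, "amount": amount})

# Gold: business-level aggregate ready for BI consumption.
gold = {
    "order_count": len(silver),
    "revenue": sum(r["amount"] for r in silver),
}
print(gold)
```

The point of the pattern is that each layer is reproducible from the one below it, so quality rules and business logic stay separated and auditable.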
What You’re Looking For:
Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.
About Us:
OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.
OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.
Senior Platform Architect
Reports To: Director of Engineering
Department: Engineering
Location: Hybrid - Atlanta, GA
What makes MTech different:
Purpose-Driven Work – Build technology that solves real problems for the world
Casual & Collaborative – No corporate bureaucracy, direct access to senior leadership
Innovation-Focused – Healthy innovation pipeline expanding into new segments and technologies
Transparent & Data-Driven – Clear metrics, objectives, and visibility into company performance
Modern Development – Robust development tools, training programs, and technical excellence
Flexibility & Balance – Flexible work environment that values results over presenteeism
Job Summary
The Senior Platform Architect will lead the technical architecture, design, and modernization of large-scale, multi-tenant enterprise SaaS platforms built on Azure and the .NET stack. This role requires mastery of distributed systems, cloud-native design, and advanced engineering practices to deliver highly available, performant, and secure solutions for global consumer-facing SaaS and Agentic AI products.
Responsibilities and Duties
Architectural Design & Transformation
- Lead migration from monolithic systems to modular monolith and microservices architectures using domain-driven design, bounded contexts, and decomposition strategies.
- Design multi-tenant SaaS platforms with advanced tenant isolation, resource partitioning, and elastic scaling using Azure services.
- Define and enforce architectural standards for .NET (C#), TypeScript, Angular, SQL Server, and Azure, including dependency injection, SOLID principles, asynchronous programming, and reactive patterns.
- Design and implement distributed systems: service orchestration, API gateway management, IoT, edge computing, distributed transactions, eventual consistency, CQRS, and event sourcing.
- Architect for cloud-native resiliency: circuit breakers, bulkheads, retries, failover, geo-redundancy, and disaster recovery using Azure App Services, Azure Functions, Service Bus, Cosmos DB, and Azure SQL.
- Develop and maintain architecture documentation, reference models, and decision records using industry frameworks (TOGAF, Zachman, C4 Model).
Performance Engineering & Observability
- Establish and monitor platform SLOs (latency, throughput, error rates, availability) mapped to customer SLAs.
- Architect and implement advanced caching strategies, indexing, and query optimization for SQL Server and NoSQL stores in coordination with Senior Data Architect, Data Engineers, and Database Admins.
- Design and implement telemetry pipelines: distributed tracing (OpenTelemetry), structured logging, metrics collection, and real-time dashboards for system health and diagnostics.
- Conduct performance profiling, load testing, and capacity planning for backend services and frontend applications.
Automation, Quality, and DevOps
- Architect and implement CI/CD pipelines with automated build, test, security scanning, and deployment workflows.
- Integrate static code analysis, code coverage, and quality gates into the development lifecycle.
- Design and enforce automated testing strategies: unit, integration, contract, and end-to-end tests for backend and frontend components.
- Develop infrastructure as code (IaC) solutions for repeatable, scalable cloud provisioning.
- Create incident response playbooks for rollback, failover, and recovery, drive down MTTR and automate remediation where possible.
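One of the resiliency patterns named earlier (retries with backoff) can be sketched compactly. This is an illustrative toy in Python; on the .NET stack this role targets, a library such as Polly would normally provide this, and the `flaky` function below is invented purely to demonstrate the behavior.

```python
import time

def call_with_retries(fn, attempts=3, base_delay=0.01):
    """Retry fn with exponential backoff, re-raising on exhaustion."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise                      # exhausted: surface the failure
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, ...

# A stand-in for a transiently failing downstream dependency:
# fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = call_with_retries(flaky)
print(result, calls["n"])   # succeeds on the third attempt
```

In a full design this sits behind a circuit breaker so that a dependency that keeps failing is cut off rather than retried indefinitely.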
Security, Compliance, and Governance
- Architect for multi-tenant security: authentication/authorization (OAuth2, OpenID Connect), encryption at rest and in transit, secrets management, and compliance with SOC 1, SOC 2, GDPR, and other regulatory standards.
- Implement secure software development lifecycle (SSDLC) practices, threat modeling, and vulnerability management, including ZDR, DLP, No Model Training policies with AI Models.
- Ensure architectural governance and alignment with enterprise frameworks (TOGAF, Zachman), maintain architecture decision records, and participate in architecture review boards.
Technical Leadership & Collaboration
- Mentor engineering teams in advanced architectural concepts, distributed systems, cloud-native development, and best practices.
- Collaborate with Data Architect, DevOps, IT Services, Engineering and Product Management teams to ensure platform extensibility, integration, and support for complex business requirements.
- Evaluate and integrate AI/ML services, advanced analytics, and developer productivity tools to enhance platform capabilities.
- Champion a culture of technical excellence, continuous improvement, and innovation.
Required Experience & Skills
- Minimum 10+ years in software/platform engineering, with at least 8 years in platform architecture for enterprise SaaS on Azure and .NET tech stack.
- Proven experience architecting and delivering large-scale, multi-tenant SaaS platforms for global consumer-facing products.
- Deep expertise in .NET (C#), Azure cloud services (App Services, Functions, Service Bus, Cosmos DB, SQL Server), Azure Open AI, Microsoft Agent Framework, TypeScript, Angular, CI/CD, automated testing, and observability.
- Mastery of distributed systems, cloud-native patterns, event-driven architectures, and microservices.
- Demonstrated success in technical debt reduction, performance engineering, and architectural modernization.
- Experience with architectural frameworks (TOGAF, Zachman, C4 Model), architectural governance, and compliance.
- Strong understanding of platform security, regulatory compliance, and multi-tenant SaaS challenges.
Success Metrics (First 12 Months)
- Reduction in platform-related incidents/support tickets.
- Improvement in deployment speed and release velocity.
- Reduction in MTTR for platform incidents.
- Achievement of modularization milestones (monolith decomposition, service rollout, platform observability in production).
- Increase in automated test coverage, code quality, and system performance metrics.
Preferred Skills & Certifications
- TOGAF, Zachman, or similar architecture certification.
- Advanced knowledge of event sourcing, CQRS, service mesh, and cloud-native security.
- Familiarity with semantic technologies, knowledge graphs, and AI/ML integration.
- Hands-on experience with infrastructure as code, automated testing tools, and modern DevOps practices.
- Strong background in platform security, compliance, and multi-tenant SaaS challenges.
EEO Statement
Integrated into our shared values is MTech’s commitment to diversity and equal employment opportunity. All qualified applicants will receive consideration for employment without regard to sex, age, race, color, creed, religion, national origin, disability, sexual orientation, gender identity, veteran status, military service, genetic information, or any other characteristic or conduct protected by law. MTech aims to maintain a global inclusive workplace where every person is regarded fairly, appreciated for their uniqueness, advanced according to their accomplishments, and encouraged to fulfill their highest potential.
Hi,
This is Vamshi from Software Technology. We have a job opening with our client for the position of DBA/Data Architect. If you are available and looking for any new opportunities, please send me your updated resume for the below position ASAP.
Job Title: DBA/DATA Architect
Location: Denver, CO (Hybrid – 3 days per week in office)
Duration: Full Time / Long-term Contract
Must-have skills: Architect level preferred, but must be hands-on with a proactive driver mindset (individual contributor; no team management responsibility)
Skills To Be Evaluated On
Azure Data Factory, MS SQL Server DBA, Performance Tuning, Data Pipeline, Data Management
Technical Skills
- Azure Data Factory, Azure Data Lake Storage (ADLS), Azure Databricks, Azure SQL Database.
- Strong SQL, Python, Scala, and PySpark.
- Experience in building data models and schema design.
- Experience with SSIS
- Oracle DB experience would be a Plus
Roles & Responsibilities
- Design, build, and optimize data pipelines and ETL processes using Azure Data Factory and Databricks.
- Manage Azure SQL Database, including performance tuning, indexing, and query optimization.
- Design scalable data solutions, data lakes (ADLS), and data warehousing solutions.
- Ensure data security, quality, and integrity throughout the lifecycle.
- Monitor data pipelines and resolve performance issues or failures.
- Work with data scientists and analysts to support data-driven decision-making.
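The performance tuning and indexing duties above can be illustrated with a tiny query-plan comparison. SQLite stands in here for Azure SQL Database, and the table is invented; the same before/after-index reasoning applies to any relational engine.

```python
import sqlite3

# A small table queried by a non-key column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i % 100) for i in range(1000)])

# Plan without an index: the engine must scan the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# Plan with the index: the engine seeks directly via the index.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
).fetchall()

print(plan_before[0][3])  # full table scan
print(plan_after[0][3])   # index search
```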
Thanks,
Vamshi Thangadpalli
Technical Recruiter
Email: | Web:
Overlook Center, Suite 200
Princeton, NJ 08540
Hi
Role: Oracle SOA Sr or Architect
Location: Atlanta, GA – Onsite – 5 days
Project Type: Contract
Preferably an Architect; otherwise a very senior consultant.
Must be very solid on middleware technologies (SOA, EDI, FTS, etc.).
NOTE:
They need a solid and very experienced Oracle SOA architect.
The person needs to be senior and must be able to work with multiple teams, manage clients, and handle all issues/tickets during US working hours.
Job description:
- Incumbent will lead month/quarter/year-end financial closing meetings from the IT side.
- He/She will be engaged in the ongoing Oracle Cloud migration project.
- Document all recurring issues, find the root cause (RCA), and get them fixed permanently.
- Our team continually innovates to deliver digital solutions to support complex, dynamic operations.
Qualifications:
BS/B. Tech or MS degree or equivalent experience relevant to functional area
- Overall 10+ years of experience in IT.
- 8+ years of experience with Oracle Cloud Fusion (Finance)
- Extensive working knowledge of Oracle Fusion Financials modules such as General Ledger, Accounts Payable, Accounts Receivable, Cash Management, and Fixed Assets
- 5+ years’ experience in leading month end financial closings from IT side.
- Organizational and planning skills including scheduling
Detailed Job Description:
- Work extensively on Oracle Finance, including conversions, migrations, client configuration, code customization, report creation, etc.
- Extensive experience with interfaces, conversions, and migrations.
- Extensive experience with Accounts Receivable, Accounts Payable, and General Ledger.
- Experience in Requirement Gathering, Functional Studies, Testing, and Application Maintenance on Oracle EBS.
- Active participation in review and gap analyses in R12 Upgrade, functional/process requirement gathering, and technical design.
- Extensively worked on XML publisher reports and WEBADI.
- Experience in Handling Production Issues and solving defects.
- Adept in the database languages SQL and PL/SQL; skilled in writing and debugging queries, stored procedures, packages, views, triggers, and functions.
- Experience with various tools like Oracle Forms (Customizations and personalization) and Development of new XML Reports using different reporting tools like Oracle Report Builder, BI Publisher, PVCS, WINSCP.
- Constantly working towards developing skills in the latest emerging technologies.
- Good Communication skills, programming, problem solving and trouble-shooting skills.
About the Company
FCSLA Life is committed to providing exceptional service and support to our members. Our mission is to ensure that every member feels valued and understood, fostering a culture of inclusivity and respect.
About the Role
The experienced Microsoft Developer designs, develops, maintains, and supports web-based and Windows applications. The role requires strong expertise in C#, VB6, and Microsoft SQL Server, and a solid understanding of both modern and legacy systems. This role involves working closely with business stakeholders to enhance existing applications and build new solutions that meet evolving organizational needs.
Essential Functions
- Design, develop, and maintain Windows and web applications using Microsoft technologies
- Write clean, efficient, and well-documented code in C# and VB6
- Develop and optimize SQL Server databases, stored procedures, views and queries
- Maintain and modernize legacy VB6 Applications, including integration with newer systems
- Support and maintain solutions built with Crystal Reports 10 and MS Access
- Collaborate with analysts, QA, and end users to gather requirements and deliver solutions
- Troubleshoot, debug, and resolve application and database issues
- Participate in code reviews and ensure adherence to development standards and best practices
- Support deployments, upgrades, and ongoing production maintenance
- Create technical documentation for applications and processes
- Resolve Help Desk issues
- All other duties as assigned
Education & Experience
- Four-year degree in computer science or a related field, or equivalent experience
- Strong experience with C# (.NET Framework / .NET Core)
- Proven experience supporting and enhancing VB6 Applications
- Advanced knowledge of Microsoft SQL Server, including:
  - T-SQL
  - Stored procedures
  - Performance tuning and indexing
- Experience with web development (ASP.NET, MVC, Web APIs, or similar)
- Experience developing Windows applications (WinForms and/or WPF)
- Understanding of software development lifecycle (SDLC)
- Strong problem-solving and analytical skills
- Ability to work independently and collaboratively in a team environment
- Strong software development background and system management experience
- Proficiency with the Microsoft Office Suite, desktop PC and calculator, the Policy Management System (proprietary software for the main database), FormDocs, and Fortis
Preferred Qualifications
- Experience migrating VB6 applications to .NET
- Familiarity with HTML, CSS, JavaScript
- Experience with Visual Studio, source control (Git, TFS, or similar)
- Knowledge of RESTful services and API integrations
- Experience in Agile or Scrum environments
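The RESTful services and API integration work mentioned above typically amounts to building authenticated JSON requests against an HTTP endpoint. As a minimal, language-neutral sketch (in Python, though the role's code would be C#), here is the request-construction side; the endpoint URL, payload fields, and token are hypothetical, and the request is built but never sent:

```python
import json
import urllib.request

# Hypothetical policy-service endpoint; the real system and URL would be
# specific to the employer's environment.
BASE_URL = "https://example.com/api/policies"

def build_create_request(policy: dict, token: str) -> urllib.request.Request:
    """Build (but do not send) a JSON POST request for a REST API."""
    body = json.dumps(policy).encode("utf-8")
    return urllib.request.Request(
        BASE_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = build_create_request({"holder": "J. Smith", "premium": 120.5}, "demo-token")
print(req.get_method())                 # POST
print(json.loads(req.data)["holder"])   # J. Smith
```

In C# the equivalent would use `HttpClient` with a JSON body; the essentials are the same: serialize the payload, set the content type, and attach credentials in the `Authorization` header.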
Work Environment
This job is performed in a professional office environment. This is a full-time position with business hours Monday through Friday. Hours of work are typically 8:00 a.m. to 4:30 p.m. Additional hours may be worked as appropriate. Work is routinely performed using standard office equipment such as computers, phones and copiers, in a fast-paced environment.
Physical Demands
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. While performing the duties of this job, the employee is regularly required to talk, hear, sit for long periods of time, use hands and fingers to keyboard, use standard office equipment such as computers, phones, and copiers, navigate about the office, and view materials and equipment needed to perform required tasks. This position requires the ability to occasionally lift office products and supplies weighing up to 30 pounds. Work also requires the ability to reach into top filing cabinet drawers and to bend or stoop to reach into bottom filing cabinet drawers.
Travel
This position requires no travel.
Equal Opportunity Statement
FCSLA Life is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.