With Query Example SQL Jobs in USA
AGE Solutions is a premier technology and professional services company, providing in-depth consulting, advanced technology solutions, and essential services to the U.S. government, defense, and intelligence sectors. Prioritizing innovation and client-focused solutions, we assist major agencies in addressing intricate issues and ensuring a more secure future.
AGE is seeking a Software Developer with strong C# and database expertise to join our development team. This role focuses on building and maintaining robust middle-tier services and data-driven applications. The candidate will work closely with cross-functional teams to design, develop, maintain and optimize scalable backend systems that power critical business functionality.
The ideal candidate combines solid middle-tier development experience with deep database development knowledge and a strong understanding of system design, performance, maintainability, and data integrity.
This role is hybrid in Philadelphia, PA, requiring onsite reporting at the customer's facility at least 1 day/week. Candidates must reside within a commutable distance of Philadelphia, PA in order to work onsite as required.
Responsibilities Include:
- Design, develop, and maintain middle-tier services and backend components using C# and .NET technologies.
- Apply SOLID principles and clean architecture practices in application design.
- Build robust APIs and business logic layers to support web and enterprise applications.
- Collaborate with front-end developers, architects, and DevOps teams to deliver integrated solutions.
- Design, develop, optimize, and maintain relational databases (Oracle preferred).
- Write efficient stored procedures, views, functions, and complex queries.
- Optimize database performance, indexing strategies, and query tuning.
- Participate in code reviews and enforce best practices for clean, maintainable code.
- Troubleshoot and resolve production issues related to application logic and data layers.
- Contribute to architectural decisions and technical design discussions.
- Document technical designs and implementation details.
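As a rough sketch of the indexing and query-tuning work listed above: the role targets Oracle and SQL Server, but the same workflow, inspect the plan, add an index for the predicate, confirm the plan changes, can be illustrated with Python's built-in sqlite3 (table and index names here are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

# Before indexing: the optimizer has no choice but a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
print(plan_before[0][3])  # e.g. 'SCAN orders'

# Add an index on the filter column, then re-check the plan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()
print(plan_after[0][3])  # e.g. 'SEARCH orders USING INDEX idx_orders_customer ...'
```

The same scan-versus-seek reasoning applies in Oracle (`EXPLAIN PLAN`) and SQL Server (`SET SHOWPLAN_TEXT`), just with different plan-inspection commands.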
Required Skills, Qualifications and Experience:
- BA/BS in a technical discipline.
- 10 years of experience in middle-tier and database development.
- Experience in applying SOLID principles and object-oriented design patterns.
- Strong proficiency in C# and .NET (.NET Core/.NET 6+) and ASP.NET Web API.
- Experience designing and building RESTful APIs and middle-tier services.
- Experience with ORM frameworks (Entity Framework preferred; Dapper also used).
- Strong SQL skills and experience with relational databases (Oracle preferred; SQL Server and PostgreSQL also relevant).
- Experience writing and optimizing complex queries and stored procedures.
- Strong understanding of data modeling and database design principles.
- Experience with version control systems (TFVC and Git).
- Strong problem-solving skills and attention to detail.
- Must be a United States citizen with a DoD Secret clearance or ability to obtain a favorably adjudicated T3 investigation.
Preferred Qualifications:
- Experience with microservices architecture
- Experience with CI/CD pipelines and DevOps best practices
- Experience with cloud platforms (Azure preferred)
- Experience with caching strategies (Redis, in-memory caching)
- Experience with performance profiling and monitoring tools
- Experience with containerization (Docker, Kubernetes)
- Experience with automated testing frameworks
Work Environment and Physical Demand:
- Must be able to work for extended periods of time at a computer.
Compensation: $115,000+
At AGE Solutions, we reward performance, invest in growth, and share success. Our benefits support the whole person, professionally, financially, and personally.
- 26 Days Paid Leave: Includes vacation, sick, personal time, and holidays. You choose how to use it.
- Performance Bonuses: Performance bonuses are awarded based on individual contributions and company-wide results, aligning recognition with impact.
- 401(k) with Match: We match 3% of your contributions with immediate vesting.
- Financial Protection: Company-paid life insurance up to $300K and options for additional coverage for you and your dependents.
- Health Benefits: Multiple medical plans, dental, vision, FSA and HSA options to fit your needs.
- Parental Leave: 15 days of fully paid leave for new parents, because family matters.
- Military Differential Pay: We bridge the gap for employees on active duty, so they don't take a financial hit while serving.
- Professional Growth: Paid training and certifications, tuition reimbursement, and the tools and tech to get the job done right.
- Shared Success: In the event of a company sale, our CEO has committed to returning 80% of net proceeds to employees. This ensures our team shares in the long-term value they help create.
At AGE, you'll do work that matters, supported by a company that delivers for its people.
Able to operate independently in low-structure environments, collaborate across business and IT, and deliver high-quality, AI-ready data ecosystems.
Role Purpose
Establish, advance, and mature data quality and governance capabilities in a greenfield, low-maturity data environment. Support enterprise analytics, BI, and AI/ML readiness through SQL/ETL engineering, data profiling, validation, stewardship, metadata management, and early-stage data architecture. Drive long-term improvement of data standards, definitions, lineage, and quality processes.
Key Responsibilities
Data Quality & Engineering
- Perform data audits, profiling, validation, anomaly detection, and quality-gap identification.
- Develop automated data quality rules and validation logic using T-SQL, SQL Server, stored procedures, and indexing strategies.
- Build and maintain SSIS packages for validation, cleansing, transformation, and error-detection workflows.
- Troubleshoot ETL/ELT pipelines, data migrations, integration failures, and data load issues.
- Conduct root cause analysis and implement preventive and long-term remediation solutions.
- Optimize SQL queries, tune stored procedures, and improve data processing performance.
- Document audit findings, validation processes, data flows, standards, and quality reports.
- Build dashboards and reports for data quality KPIs using Power BI/Tableau.
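The automated data quality rules described above are T-SQL/SSIS-based in this role; as a rough illustration of the pattern (Python's built-in sqlite3 standing in for SQL Server, with invented table and rule names), a rule-driven completeness/validity check might look like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT, state TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    (1, "a@example.com", "PA"),
    (2, None,            "NY"),   # fails the completeness rule
    (3, "c@example.com", "ZZ"),   # fails the validity rule
    (4, "d@example.com", "NJ"),
])

# Each rule pairs a name with a SQL predicate that matches *violations*.
rules = [
    ("email_not_null", "email IS NULL"),
    ("state_is_valid", "state NOT IN ('PA', 'NY', 'NJ')"),
]

total = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
results = {}
for name, predicate in rules:
    bad = conn.execute(f"SELECT COUNT(*) FROM customers WHERE {predicate}").fetchone()[0]
    results[name] = {"violations": bad, "pass_rate": 1 - bad / total}

print(results)
```

The per-rule pass rates are exactly the kind of data quality KPI that would feed the Power BI/Tableau dashboards mentioned above.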
Data Stewardship & Governance
- Define, maintain, and enforce data quality standards, business rules, data definitions, and governance policies.
- Monitor datasets for completeness, accuracy, timeliness, consistency, and compliance.
- Ensure proper and consistent data usage across departments and systems.
- Maintain business glossaries, data dictionaries, metadata repositories, and lineage documentation.
- Partner with IT, data engineering, and business teams to support governance initiatives and compliance requirements.
- Provide training on data entry, data handling, stewardship practices, and data literacy.
- Collaborate with cross-functional teams to identify recurring data issues and recommend preventive solutions.
Greenfield / Low-Maturity Environment
- Architect initial data quality frameworks, validation layers, governance artifacts, and ingestion patterns.
- Establish scalable data preparation workflows supporting analytics, BI, and AI/ML readiness.
- Mature data quality and governance processes from ad hoc to standardized, automated, and measurable.
- Drive adoption of data quality and governance practices across business and technical teams.
- Support long-term evolution of enterprise data strategy and governance maturity.
Required Technical Skills
- Advanced T-SQL, SQL Server development, debugging, and performance tuning.
- SSIS development, deployment, and troubleshooting.
- Data profiling, validation rule design, quality scoring, and measurement techniques.
- ETL/ELT pipeline design, debugging, and optimization.
- Data modeling (conceptual, logical, physical).
- Metadata management and lineage documentation.
- Reporting and dashboarding with Power BI, Tableau, or similar tools.
- Strong documentation and communication skills.
Preferred Skills
- Knowledge of DAMA DMBoK, DCAM, MDM concepts, and governance frameworks.
- Experience in low-maturity/greenfield data environments.
- Familiarity with AI/ML data readiness and feature-store-aligned data structuring.
- Cloud data engineering exposure (Azure, Databricks, GCP).
Education
- Bachelor's degree in Information Systems, Computer Science, Data Science, Statistics, Business Analytics, or related field.
- Master's degree preferred.
Certifications (Preferred)
- DAMA CDMP (Associate/Practitioner)
- EDM Council DCAM
- ASQ Data Quality Credential
- Collibra Data Steward Certification
- Certified Data Steward (eLearningCurve)
- Cloud/AI certifications (Azure, Databricks, Google)
Title: Product Manager
Location : Rockville, MD or McLean, VA
Target Start Date : ASAP
Type: contract
Pay Rate: DOE
The Product Manager is responsible for defining product vision, strategy, roadmap, and feature development for a portfolio of market surveillance products. This portfolio includes a large set of existing surveillance patterns as well as machine learning and deep learning models, some of which are undergoing redesign.
This role requires a highly autonomous product leader who can manage multiple feature initiatives simultaneously while partnering closely with business stakeholders to understand regulatory needs, operational pain points, and opportunities for improvement. The Product Manager will maintain product backlogs, prioritize enhancements, define roadmaps, and ensure the successful delivery of new surveillance capabilities and enhancements to existing systems.
The role also requires familiarity with financial market data sources, including audit trail data, exchange data, and reference data, and how these datasets support regulatory and compliance objectives.
Key Responsibilities
Product Strategy & User Insight
- Develop a deep understanding of the business domain, regulatory objectives, and available data sources.
- Define product vision, strategy, and requirements based on user needs, regulatory priorities, and data insights.
- Conduct research, analyze user feedback, and leverage data analysis to identify gaps and opportunities for improvement.
- Translate insights into product requirements and actionable development initiatives.
Product Portfolio & Roadmap Management
- Own and manage the product roadmap for surveillance capabilities.
- Evaluate trade-offs and prioritize features based on user value, regulatory impact, and resource constraints.
- Partner with business stakeholders, engineering, architecture, and UX teams to ensure alignment and successful delivery.
- Coordinate with dependent teams across the organization to support integrated product development.
Product Planning & Delivery
- Lead product planning by developing requirements, including user stories, acceptance criteria, and use cases.
- Maintain a prioritized product backlog aligned with product strategy and delivery capacity.
- Collaborate with UX/UI teams to guide user experience design.
- Participate in development reviews, validate acceptance criteria, and ensure product quality.
- Identify risks or issues that may impact delivery timelines or product performance and develop mitigation plans.
Product Launch & Adoption
- Lead product launches and coordinate with stakeholders on rollout planning.
- Facilitate user acceptance testing (UAT) where required.
- Develop supporting documentation and training materials.
- Track launch metrics, gather user feedback, and drive iterative improvements.
Product Operations & Continuous Improvement
- Monitor product performance, usage trends, and operational metrics.
- Work with internal users and stakeholders to resolve product issues and identify enhancement opportunities.
- Evaluate new metrics and monitoring capabilities to improve product performance and visibility.
- Represent the product team in stakeholder discussions and business reviews.
Team Development
- Provide guidance and mentorship to junior product management team members.
- Stay informed on industry trends, regulatory developments, and product management best practices.
Skills & Abilities
- Strong analytical and problem-solving skills with the ability to interpret complex data.
- Experience using database queries (e.g., SQL) and data analysis to inform product decisions.
- Excellent written and verbal communication skills.
- Ability to manage multiple priorities and make decisions in complex environments.
- Strong organizational skills and attention to detail.
- Self-starter with the ability to dive deeply into business processes and technical capabilities.
Required
- Bachelor's degree in Business, Finance, Engineering, Communications, or a related field (or equivalent experience).
- 5+ years of experience in product management, compliance, business analysis, program management, or related roles.
- Experience with database querying (e.g., SQL) and data analysis.
- Experience working within the software development lifecycle.
- Demonstrated ability to collaborate across teams in large organizations and work closely with leadership.
Preferred
- Experience with broker-dealer operations, market surveillance, or regulatory compliance.
- Experience guiding cross-functional teams and influencing stakeholders.
Work Environment
- Hybrid work environment with remote and in-office collaboration.
- Occasional extended hours may be required.
Welcome to ConsultNet, a premier national provider of technology talent and solutions. Our expertise spans across project services, contract-to-hire, direct search, and managed services onshore, nearshore, and hybrid. For over 25 years, we have connected thousands of consultants with meaningful roles through a personal, communication-driven approach, partnering with a diverse client base to build high-performing teams and create lasting impact. Our comprehensive service offerings cover a wide range of technology and engineering positions across key markets nationwide. Learn more at .
We champion equality and inclusivity, proudly supporting an Equal Opportunity Employer policy. We welcome applicants regardless of Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other status protected by law.
Meet REVOLVE:
REVOLVE is the next-generation fashion retailer for Millennial and Generation Z consumers. As a trusted, premium lifestyle brand, and a go-to online source for discovery and inspiration, we deliver an engaging customer experience from a vast yet curated offering totaling over 45,000 apparel, footwear, accessories and beauty styles. Our dynamic platform connects a deeply engaged community of millions of consumers, thousands of global fashion influencers, and more than 500 emerging, established and owned brands. Through 16 years of continued investment in technology, data analytics, and innovative marketing and merchandising strategies, we have built a powerful platform and brand that we believe is connecting with the next generation of consumers and is redefining fashion retail for the 21st century. At REVOLVE, the most successful team members have the thirst and the creativity to make this the top e-commerce brand in the world. With a team of 1,000+ based out of Cerritos, California, we are a dynamic bunch motivated by getting the company to the next level. It's our goal to hire high-energy, diverse, bright, creative, and flexible individuals who thrive in a fast-paced work environment. In return, we promise to keep REVOLVE a company where inspired people will always thrive.
To take a behind the scenes look at the REVOLVE “corporate” lifestyle check out our Instagram @REVOLVEcareers or #lifeatrevolve.
Are you ready to set the standard for Premium apparel?
Main purpose of the Senior Data Science Analyst role:
Use a diverse skill set across math and computer science to solve complex and analytically challenging problems here at Revolve.
Major Responsibilities:
Essential Duties and Responsibilities include the following. Other duties may be assigned.
- Partner closely with business leaders in the Marketing, Product, Operations, and Buying teams to plan out valuable data science projects
- Conduct complex analysis and build models to uncover key learnings from data, leading to appropriate strategy recommendations.
- Work closely with the DBA to improve BI's infrastructure, architect the reporting system, and invest time in technical proofs of concept.
- Work closely with the business intelligence and tech teams to define, automate, and validate the extraction of new metrics from various data sources for use in future analysis
- Work alongside business stakeholders to apply our findings and models to website personalization, product recommendations, marketing optimization, fraud detection, demand forecasting, and CLV prediction.
Required Competencies:
To perform the job successfully, an individual should demonstrate the following competencies:
- Outstanding analytical skills, with strong academic background in statistics, math, science or technology.
- High comfort level with programming, ability to learn and adopt new technology with short turn-around time.
- Knowledge of quantitative methods in statistics and machine learning
- Intense intellectual curiosity – strong desire to always be learning
- Proven business acumen and a results-oriented approach.
- Ability to demonstrate logical thinking and problem-solving skills
- Strong attention to detail
Minimum Qualifications:
- Master's degree required
- 3+ years of DS and ML experience in a strong analytical environment.
- Proficient in Python, NumPy, and related packages
- Familiar with statistical and ML methodology: causal inference, logistic regression, tree-based models, clustering, model validation and interpretation.
- Experience with A/B testing and pseudo-A/B test setup and evaluation
- Advanced SQL experience, including query optimization and data extraction
- Ability to build, validate, and productionize models
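The A/B-testing evaluation work named above often reduces to a two-proportion z-test on conversion rates. A minimal stdlib-only sketch (the conversion counts are hypothetical, purely for illustration):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided p-value
    return z, p_value

# Hypothetical experiment: 8.0% control vs 9.2% variant conversion.
z, p = two_proportion_ztest(400, 5000, 460, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

In practice one would pre-register the sample size and significance level before peeking at results; libraries such as statsmodels offer equivalent tests if available.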
Preferred Qualifications:
- Strong business acumen
- Experience deploying end-to-end machine learning models
- 5+ years of DS and ML experience preferred
- Advanced SQL and Python, with query and coding optimization experience
- Experience with e-commerce marketing and product analytics is a plus
A successful candidate works well in a dynamic environment with minimal supervision. At REVOLVE we all roll up our sleeves to pitch-in and do whatever it takes to get the job done. Each day is a little different, it’s what keeps us on our toes and excited to come to work every day.
A reasonable estimate of the current base salary range is $120,000 to $150,000 per year.
Sr. Full Stack Engineer
Job ID
2025-2140
# of Openings
1
Overview
Currently seeking multiple Full Stack Developers in support of U.S. Citizenship and Immigration Services (USCIS) Engineering Support for Identity Services (ESIS). This individual will support Agile application development technologies and capabilities in the areas of software development, systems engineering, integration, and testing of software applications and infrastructure. Must be skilled in front-end, back-end, and database development. Will design and implement full stack cloud solutions, including IaaS, PaaS, and SaaS; design and deploy computing infrastructure, physical or virtual machines, and other resources such as virtual-machine disk image libraries, block and file-based storage, firewalls, load balancers, IP addresses, and virtual local area networks; implement cloud-based platform and software services for AWS; and perform DevOps functions.
Key Skills:
- 10+ years of experience with full stack engineering with proficiency in database development/integration as well as server and client application development/integration
- Software developing experience using Python and Java Spring framework
- Experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
- AWS experience required with experience deploying enterprise applications in AWS
- Experience with CI/CD environment tools such as Docker, Jenkins, Ansible, Kubernetes
Responsibilities
- Software development with Python, Java, React, and various scripting languages
- Design data models and web APIs and creation of software tasks from system requirements
- Perform requirements analysis, design, development, unit, and integration testing of software, troubleshooting and debugging of the system
- Immediate responsibilities will include enhancing and maintaining the existing system as well as design, development, and documentation of new features
- Create Git releases, open pull requests, and perform code reviews
- Query logs using Splunk and monitor dashboards using New Relic
- Use Atlassian tools for day-to-day tasks within the Scrum process
- Implement web services, data persistence access features and external interfaces
- Partner closely with front-end and database engineers to ensure features are developed holistically
- Follow Agile software development methodology and team architecture standards.
- Will need to be able to read Architecture Diagrams
- Perform testing to improve code coverage, including service mocking, test-driven development, and unit testing
- Will modify Helm Charts, Jenkinsfiles, and Dockerfiles
Qualifications
- MUST BE US CITIZEN
- Bachelor's degree required
- Must be able to obtain and maintain a Public Trust security clearance
- 10+ years of experience in Software Engineering
- Must have experience in Python and Java Spring Framework (Boot, Batch, Data, Security)
- Must have experience with other software technologies such as Web Services (SOAP/REST), React/Angular, VS Code, SQL, Gradle, and/or Git
- Experience with design, development, enhancement, troubleshooting and debugging of web applications
- Must have experience in an AWS cloud environment and with CI/CD tools (e.g., Docker, Jenkins, Kubernetes) for deployment processes, monitoring production environments, and modifying Docker/Jenkins files and Helm charts
- Experience with scripting languages (Python, Bash, PowerShell, Perl) is not required but nice to have
- Understanding of branching concepts and of tools such as Git, VS Code, and/or Rancher
- Experience with creating Git releases, creating pull requests, and reviewing code
- Experience monitoring dashboards utilizing New Relic
- Experience with Splunk to query logs
- Experience with JUnit testing preferred
- Experience creating release instructions utilizing JIRA
- Experience developing and integrating complex software systems through the full SDLC
- Experience with Agile Scrum
- Must have strong written and verbal communication skills
Target Pay Range
The pay range listed below for this position is not a guarantee of compensation or salary. The final offered salary will be influenced by a host of factors including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior work experience, specific skills and competencies, education, and certifications. Our employees value the flexibility at Pyramid Systems that allows them to balance quality work and their personal lives. We offer competitive compensation and benefits, including our Employee Stock Ownership Program, FlexPTO, and learning and development opportunities.
Pyramid Min
USD $125,731.00/Yr.
Pyramid Max
USD $188,597.00/Yr.
Why Pyramid?
Pyramid Systems, Inc. is an award-winning technology leader driving digital transformation across federal agencies. We empower forward-thinking innovations, accelerate production-ready software, and deliver secure solutions so federal agencies can meet their mission goals. Voted a Top Workplace both regionally (Washington, DC) and nationally (USA) for the past two years (2023 and 2024) based on feedback from our employees, we are headquartered in Fairfax, VA, and have a growing national footprint. We value and promote our Flexible Workplace approach because of the positive impacts it has on work-life integration. We remain committed to ensuring every employee's voice is heard, performance and results are recognized and rewarded, development and advancement is a focus, and diversity, equity and inclusion is a company priority. We offer competitive compensation and benefits (including a recently launched Employee Stock Ownership Plan - ESOP), a robust performance-based rewards program, and we know how to have fun! Our people and culture have endured and delivered for our clients for nearly three decades.
EEO Statement
Pyramid Systems, Inc. is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or protected veteran status and will not be discriminated against on the basis of disability.
Job Description:
We believe in bold ideas, diverse perspectives, and the drive to transform knowledge into impact. Here, your curiosity fuels progress, your voice shapes innovation, and your ambition helps redefine what's possible within science and learning. We are a culture that obsesses over impact, challenges, and drives what's next to power infinite possibilities for our customers, colleagues and society at large.
About the Role:
We are seeking a highly skilled and experienced Data Analyst with a Power BI specialty to join a dynamic team of marketing analysts in driving forward our mission to transform business and ad tech data into valuable marketing insights. As a Data Analyst / Power BI Specialist on the Wiley Marketing team, you will play a crucial role developing strategically important shared data models for reports that influence strategic decision-making, shed light on campaign performance, and ensure the success of our publishing initiatives. We're looking for a methodical future-thinker who can support and guide others in the team to do the same. We have a rich data landscape at our fingertips; this role is a key advocate in its practical application to benefit marketing effectiveness.
How you will make an impact:
Develop and maintain shared semantic models in Power BI using star schema design.
Develop SQL queries to extract and manipulate data from our BigQuery database for use in Power BI.
Define and implement best-practice report/data management through Microsoft Fabric integrations.
Review and maintain reports for performance optimizations, style consistency, and best practice improvements.
Consult on and support building a data model and structure that enables efficient and accurate reporting and analysis.
Consult on wireframe design during the early stages of building new reports to provide guidance on effective visualization and user experience.
Consult on the development of strategically insightful reporting dashboards with actionable insights to support key stakeholder decision-making processes.
We are looking for people who have:
Experience (4+ years) in Data Modelling, Data Analysis, or similar.
Demonstrated experience with Power BI, semantic models, and other Microsoft Fabric tools.
Demonstrated proficiency creating SQL queries to manipulate large datasets.
Strong analytical and problem-solving skills, with a focus on best practice and data governance.
Strong attention-to-detail, with the ability to organize and maintain datasets using meticulous, self-determined methods.
Strong communication skills, with the ability to define requirements and explain technical concepts to both technical and non-technical stakeholders.
Experience with BigQuery or other data warehousing platforms.
We power infinite possibilities.
For more than 200 years, we've transformed knowledge into discoveries that shape the world. Today, our global team of innovators, creators, and experts is driving what's next in science, education, and publishing, creating impact that reaches everywhere.
We're not just observers of progress. We're the ones accelerating scientific breakthroughs, advancing learning, and sparking innovation that redefines entire fields and improves lives.
Here, your talent matters. Your ideas have room to grow. And your work creates breakthroughs that can change everything.
Wiley is an equal opportunity/affirmative action employer. We evaluate all qualified applicants and treat all qualified applicants and employees without regard to race, color, religion, sex, sexual orientation, gender identity or expression, national origin, disability, protected veteran status, genetic information, or based on any individual's status in any group or class protected by applicable federal, state or local laws. Wiley is also committed to providing reasonable accommodation to applicants and employees with disabilities. Applicants who require accommodation to participate in the job application process may contact for assistance.
We are proud that our workplace promotes continual learning and internal mobility. Our values support courageous teammates, needle movers, and learning champions all while striving to support the health and well-being of all employees. We offer meeting-free Friday afternoons allowing more time for heads down work and professional development, and through a robust body of employee programing we facilitate a wide range of opportunities to foster community, learn, and grow.
We are committed to fair, transparent pay, and we strive to provide competitive compensation in addition to a comprehensive benefits package. The range below represents Wiley's good faith and reasonable estimate of the base pay for this role at the time of posting for roles in the United Kingdom, Canada, or USA. It is anticipated that most qualified candidates will fall within the range; however, the ultimate salary offered for this role may be higher or lower and will be set based on a variety of non-discriminatory factors, including but not limited to, geographic location, skills, and competencies.
When applying, please attach your resume/CV to be considered.
Salary Range:
85,500 USD to 122,567 USD
Job Posting Title:
Data Analyst - Power BI Specialist
Location:
Hoboken (HQ), NJ, USA
Sr. Data Engineer (PySpark & Python + AI Tools Exp.) - (Only W2 or 1099)
Charlotte, NC (Hybrid)
12+ Months Contract
Job Description:
We are currently seeking a Senior Data Engineer with hands-on coding experience and a strong background in Python, PySpark, and Object-oriented programming.
The ideal candidate will be responsible for designing, developing, and implementing new features to our existing framework using PySpark and Python.
This position requires a deep understanding of data transformation and the ability to create standalone scripts based on given business logic. Exposure to AI tools and experience building AI applications will be an advantage.
Key Responsibilities:
- Design, develop, and optimize large-scale data pipelines using PySpark and Python.
- Implement and adhere to best practices in object-oriented programming to build reusable, maintainable code.
- Write advanced SQL queries for data extraction, transformation, and loading (ETL).
- Collaborate closely with data scientists, analysts, and stakeholders to gather requirements and translate them into technical solutions.
- Troubleshoot data-related issues and resolve them in a timely and accurate manner.
- Leverage AWS cloud services (e.g., S3, EMR, Lambda, Glue) to build and manage cloud-native data workflows (preferred).
- Participate in code reviews, data quality checks, and performance tuning of data jobs.
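The reusable, object-oriented transformation pattern the responsibilities above describe can be sketched framework-agnostically. In this illustrative sketch, plain Python lists of dicts stand in for PySpark DataFrames, and all class and column names are invented; with PySpark, each step's `apply()` would take and return a `pyspark.sql.DataFrame` instead:

```python
from abc import ABC, abstractmethod

class TransformStep(ABC):
    """One reusable unit of business logic in the pipeline."""
    @abstractmethod
    def apply(self, rows: list[dict]) -> list[dict]: ...

class DropNulls(TransformStep):
    """Remove rows where the configured column is missing."""
    def __init__(self, column: str):
        self.column = column
    def apply(self, rows):
        return [r for r in rows if r.get(self.column) is not None]

class AddDerivedColumn(TransformStep):
    """Append a new column computed from each row."""
    def __init__(self, name: str, fn):
        self.name, self.fn = name, fn
    def apply(self, rows):
        return [{**r, self.name: self.fn(r)} for r in rows]

class Pipeline:
    """Composes steps so each piece of business logic stays testable and reusable."""
    def __init__(self, steps: list[TransformStep]):
        self.steps = steps
    def run(self, rows):
        for step in self.steps:
            rows = step.apply(rows)
        return rows

pipeline = Pipeline([
    DropNulls("amount"),
    AddDerivedColumn("amount_with_tax", lambda r: round(r["amount"] * 1.07, 2)),
])
out = pipeline.run([{"amount": 100.0}, {"amount": None}, {"amount": 10.0}])
print(out)
```

Keeping each step as a small class with a single `apply` method is what makes the framework extensible: a new business rule becomes a new `TransformStep` subclass, not an edit to existing code.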
Required Skills & Qualifications:
- 6+ years of relevant experience in a data engineering or backend development role.
- Strong hands-on experience with PySpark and Python, especially in designing and implementing scalable data transformations.
- Solid understanding of Object-Oriented Programming (OOP) principles and design patterns.
- Proficient in SQL, with the ability to write complex queries and optimize performance.
- Strong problem-solving skills and the ability to troubleshoot complex data issues independently.
- Excellent communication and collaboration skills.
- Hands-on experience with AI Tools.
Preferred Qualifications (Nice to Have):
- Experience working with AWS cloud ecosystem (S3, Glue, EMR, Redshift, Lambda, etc.).
- Exposure to data warehousing concepts, distributed computing, and performance tuning.
- Familiarity with version control systems (e.g., Git), CI/CD pipelines, and Agile methodologies.
- Exposure to AI tools and hands-on experience building AI applications.
Must have
Teradata platform expertise
• Deep knowledge of Teradata architecture: parsing, BYNET, AMPs, vprocs, fallback, hashing, PDCR, and spool management.
• Data distribution and primary index design; collecting statistics and understanding optimizer behavior.
• Experience with recent Teradata versions and with release migration/upgrade planning: TD 16.XX, TD 17.XX, and preferably TD 20.XX.
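The data-distribution and primary-index points above come down to one idea: rows are assigned to AMPs by hashing the primary-index value, so a low-cardinality index skews the load onto a few AMPs. A generic sketch (MD5 standing in for Teradata's row-hash function; the AMP count and data are hypothetical):

```python
import hashlib
from collections import Counter


def amp_for(value, n_amps=4):
    """Assign a row to an AMP by hashing its primary-index value."""
    h = int(hashlib.md5(str(value).encode()).hexdigest(), 16)
    return h % n_amps


# Unique primary-index values spread evenly across all AMPs...
good = Counter(amp_for(i) for i in range(1000))

# ...but a two-value primary index piles all rows onto at most two AMPs.
bad = Counter(amp_for(i % 2) for i in range(1000))

print("even:", dict(good))
print("skewed:", dict(bad))
```

This is why primary-index design and collected statistics matter: the optimizer can only parallelize work that the hash has actually distributed.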
System administration
• Provisioning and managing Teradata nodes and clusters (physical and virtual).
• OS-level skills: Linux administration (SLES/RHEL/CentOS/Oracle Linux) for Teradata on Linux, including kernel tuning, package management, user and permissions management.
• Storage subsystem knowledge: SAN, NAS, Fibre Channel, LUNs, RAID, and how storage impacts Teradata I/O and spool.
Performance tuning and troubleshooting
• SQL query and plan analysis; collecting and interpreting Explain plans.
• Workload management (WLM) and resource allocation: query prioritization, throttling, and KRI/SLAs.
• Monitoring and diagnostics: using Teradata tools and logs to analyze spool, CPU, memory, disk I/O, network, BYNET contention.
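Plan analysis works the same way across engines: compare the optimizer's plan for a query before and after an indexing change. A minimal sketch using SQLite's `EXPLAIN QUERY PLAN` as a stand-in for Teradata's `EXPLAIN` (the table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")


def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail); keep the detail.
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]


query = "SELECT * FROM sales WHERE region = 'EAST'"

before = plan(query)                                   # full scan of the table
conn.execute("CREATE INDEX idx_region ON sales(region)")
after = plan(query)                                    # search via the index

print(before)
print(after)
```

On Teradata the same before/after comparison is done with `EXPLAIN` output and Viewpoint, but the tuning loop is identical: read the plan, change the access path, read the plan again.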
Backup, recovery & high availability
• Best practices for backup and restore procedures, and disaster recovery (DR) planning and testing.
• Knowledge of fallback, AMP resilience, replication methods and physical vs logical protection.
Security & compliance
• DB and platform-level security: roles, privileges, LDAP/Kerberos integration, encryption (at rest/in transit), auditing and compliance (SOX and others as applicable).
• Secure configuration and hardening practices.
Networking & infrastructure
• Network architecture for Teradata clusters, VLANs, link aggregation, low-latency requirements, and BYNET tuning.
• Integration with enterprise infrastructure: DNS, NTP, monitoring stacks, and identity providers.
Automation, scripting & tools
• Scripting languages (at least one): Bash, Python, or Perl for automation, maintenance, and custom monitoring.
• Configuration management and automation tools (at least one, as used in the enterprise): Ansible, Terraform, Chef, or Puppet.
• Familiarity with Teradata utilities and tools (at least one): BTEQ, FastLoad, MultiLoad, TPT (Teradata Parallel Transporter), DBSControl, Viewpoint, Teradata Studio/SQL Assistant.
Observability & tooling
• Use of monitoring/alerting tools (at least one; Viewpoint is mandatory): Viewpoint, Prometheus, Grafana, Splunk, Nagios, etc., and designing dashboards and alerts.
• Capacity planning, trending, and forecasting for CPU, disk, spool, and concurrency.
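Capacity trending can be as simple as fitting a least-squares line to recent usage samples and extrapolating. A sketch with hypothetical daily spool-usage figures (real forecasting would use more history and account for seasonality):

```python
# Hypothetical daily spool usage in GB, sampled on days 0..4.
samples = [100.0, 104.0, 108.0, 112.0, 116.0]

n = len(samples)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(samples) / n

# Ordinary least-squares slope and intercept.
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, samples)) / \
        sum((x - x_mean) ** 2 for x in xs)
intercept = y_mean - slope * x_mean

# Naive linear extrapolation to day 30 for capacity planning.
forecast_day_30 = intercept + slope * 30
print(round(slope, 2), round(forecast_day_30, 2))  # 4.0 220.0
```

Comparing that forecast against provisioned spool capacity is the core of the trending/forecasting bullet above.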
Soft skills & organizational capabilities
• Incident management and on-call experience.
• Leading postmortems, RCA (root-cause analysis), implementing corrective actions.
• Communication and stakeholder management: vendors, management, and application teams.
• Translate technical impacts to business stakeholders; coordinate with DBAs, developers, network/storage teams, and vendors.
Role and Responsibilities
• Installs, configures, and upgrades Teradata software and related products.
• Backup, restore, migrate Teradata data and objects
• Establish and maintain backup and recovery policies and procedures.
• Manages and monitors system performance; proactively monitors the database systems to ensure secure services with minimum downtime.
• Implements and maintains database security.
• Sets up and maintains documentation and standards.
• Supports multiple Teradata Systems including independent marts/ enterprise warehouse.
• Work with the team to ensure that adequate hardware resources are allocated to the databases, and to ensure high availability and optimum performance.
• Responsible for improvement and maintenance of the databases to include rollout and upgrades.
• Responsible for implementation and release of database changes as submitted by the development team, working with the end customer.
• Coordination with Teradata, the customer, the datacenter, and vendors
• Forecast data, security audits
• User account and access management
• Teradata active system management and customer requests and system allocation
• Backup and recovery
• SOX compliance and audits
• DB support from 3rd party vendors
• Product evaluations
• On call support and major incidents
• Backup restore, frequency and retention
• Disaster recovery
• Create long r
About the Job:
The Solutions Architect serves as the technical anchor of the Strategic Resource Group. This role is responsible for translating complex business and market questions into structured, executable data outputs using Trilliant Health’s proprietary claims, provider directory, and price transparency datasets.
The Solutions Architect owns feasibility validation, analytical methodology design, and data integrity across research initiatives and pre-sales support. This individual combines strong technical proficiency with healthcare domain expertise and plays a critical role in standardizing how recurring strategic questions are answered across the organization.
You are our ideal candidate if you:
- Design and execute complex SQL queries and data builds from Trilliant’s data warehouse
- Capture and maintain documentation outlining how and why analytical frameworks are applied to support consistency and institutional knowledge retention
- Validate data integrity and identify gaps, missingness, structural limitations, or edge cases
- Own technical feasibility assessments for research and pre-sales opportunities
- Develop repeatable analytical frameworks for common strategic use cases
- Support research initiatives through structured dataset construction and methodological validation
- Create reusable datasets, templates, and documentation to reduce institutional knowledge concentration
- Maintain high standards of quality control and analytical rigor across all deliverables
- Interface effectively with Sales, SRG, Research, Product, and Data Engineering teams
- Respond to ambiguity with structured problem solving and professional judgment
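A "repeatable analytical framework" as described above is often just a parameterized query template plus documented assumptions, so the same recurring question can be answered consistently for any cohort or period. A minimal sketch (the claims schema, column names, and figures here are hypothetical illustrations, not Trilliant's actual warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims (provider_id TEXT, service_year INTEGER, allowed_amount REAL);
INSERT INTO claims VALUES
  ('P1', 2023, 120.0), ('P1', 2023, 80.0),
  ('P2', 2023, 300.0), ('P1', 2022, 50.0);
""")

# The reusable template: the same query answers the recurring question
# "claim volume and allowed spend by provider" for any service year.
TEMPLATE = """
SELECT provider_id,
       COUNT(*)            AS claim_count,
       SUM(allowed_amount) AS total_allowed
FROM claims
WHERE service_year = ?
GROUP BY provider_id
ORDER BY total_allowed DESC
"""

rows = conn.execute(TEMPLATE, (2023,)).fetchall()
# rows == [('P2', 1, 300.0), ('P1', 2, 200.0)]
```

Documenting what the template assumes (e.g., which claims count as "allowed") is what turns a one-off query into institutional knowledge.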
Technical Skills:
- Advanced proficiency in SQL and experience querying large data warehouses
- Experience working in Databricks or similar environments preferred
- Strong proficiency in Excel and PowerPoint
- Familiarity with Tableau or other BI tools
- Experience working with complex healthcare claims datasets required
Other Skills:
- Strong analytical and critical thinking skills
- Ability to synthesize large datasets into structured outputs
- Excellent documentation and organizational skills
- Strong written and verbal communication skills
- Ability to work independently with minimal supervision
- High attention to detail and commitment to data quality
Position Location:
This position is onsite in Brentwood, TN
*We are unable to provide visa sponsorships for this role.
About Trilliant Health:
Trilliant Health is a high-growth healthcare technology company. We are on a mission to be the most trusted advisor, dependable partner, and provider of analytic insights to key stakeholders in the health economy, enabling them to maximize return on invested capital. We do that by providing education and expertise through thought leadership, evidence-based strategy, and predictive analytics. We are looking to grow our team as we strive to influence positive change in healthcare by disrupting the status quo and promoting improved decision-making.
About the Job:
The Applied Analytics Analyst serves as the technical anchor of the Strategic Resource Group. This role is responsible for translating complex business and market questions into structured, executable data outputs using Trilliant Health’s proprietary claims, provider directory, and price transparency datasets.
The Applied Analytics Analyst owns feasibility validation, analytical methodology design, and data integrity across research initiatives and pre-sales support. This individual combines strong technical proficiency with healthcare domain expertise and plays a critical role in standardizing how recurring strategic questions are answered across the organization.
You are our ideal candidate if you:
- Design and execute complex SQL queries and data builds from Trilliant’s data warehouse
- Capture and maintain documentation outlining how and why analytical frameworks are applied to support consistency and institutional knowledge retention
- Validate data integrity and identify gaps, missingness, structural limitations, or edge cases
- Own technical feasibility assessments for research and pre-sales opportunities
- Develop repeatable analytical frameworks for common strategic use cases
- Support research initiatives through structured dataset construction and methodological validation
- Create reusable datasets, templates, and documentation to reduce institutional knowledge concentration
- Maintain high standards of quality control and analytical rigor across all deliverables
- Interface effectively with Sales, SRG, Research, Product, and Data Engineering teams
- Respond to ambiguity with structured problem solving and professional judgment
Technical Skills:
- Advanced proficiency in SQL and experience querying large data warehouses
- Experience working in Databricks or similar environments preferred
- Strong proficiency in Excel and PowerPoint
- Familiarity with Tableau or other BI tools
- Experience working with complex healthcare claims datasets required
Other Skills:
- Strong analytical and critical thinking skills
- Ability to synthesize large datasets into structured outputs
- Excellent documentation and organizational skills
- Strong written and verbal communication skills
- Ability to work independently with minimal supervision
- High attention to detail and commitment to data quality
Position Location:
This position is onsite in Brentwood, TN
*We are unable to provide visa sponsorships for this role.
About Trilliant Health:
Trilliant Health is a high-growth healthcare technology company. We are on a mission to be the most trusted advisor, dependable partner, and provider of analytic insights to key stakeholders in the health economy, enabling them to maximize return on invested capital. We do that by providing education and expertise through thought leadership, evidence-based strategy, and predictive analytics. We are looking to grow our team as we strive to influence positive change in healthcare by disrupting the status quo and promoting improved decision-making.