Query With Example Jobs in USA

844 positions found — Page 7

Sr. Data Engineer (PySpark & Python + AI Tools Exp.) - (Only W2 or 1099)
✦ New
Salary not disclosed
Charlotte, NC 4 hours ago

Sr. Data Engineer (PySpark & Python + AI Tools Exp.) - (Only W2 or 1099)

Charlotte, NC (Hybrid)

12+ Months Contract


Job Description:


We are currently seeking a Senior Data Engineer with hands-on coding experience and a strong background in Python, PySpark, and Object-oriented programming.


The ideal candidate will be responsible for designing, developing, and implementing new features for our existing framework using PySpark and Python.


This position requires a deep understanding of data transformation and the ability to create standalone scripts based on given business logic. Exposure to AI tools and experience building AI applications is also an advantage.


Key Responsibilities:

  • Design, develop, and optimize large-scale data pipelines using PySpark and Python.
  • Implement and adhere to best practices in object-oriented programming to build reusable, maintainable code.
  • Write advanced SQL queries for data extraction, transformation, and loading (ETL).
  • Collaborate closely with data scientists, analysts, and stakeholders to gather requirements and translate them into technical solutions.
  • Troubleshoot data-related issues and resolve them in a timely and accurate manner.
  • Leverage AWS cloud services (e.g., S3, EMR, Lambda, Glue) to build and manage cloud-native data workflows (preferred).
  • Participate in code reviews, data quality checks, and performance tuning of data jobs.
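
A minimal PySpark sketch of the kind of pipeline work described above (all paths and column names are hypothetical placeholders):

    # Read raw orders, derive a date column, aggregate, and write the result.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

    orders = spark.read.parquet("s3://raw-bucket/orders/")        # hypothetical S3 source
    daily = (orders
             .withColumn("order_date", F.to_date("order_ts"))     # derive a date column
             .groupBy("order_date", "region")
             .agg(F.sum("amount").alias("total_amount")))         # aggregate per day/region
    daily.write.mode("overwrite").parquet("s3://curated-bucket/daily_orders/")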


Required Skills & Qualifications:

  • 6+ years of relevant experience in a data engineering or backend development role.
  • Strong hands-on experience with PySpark and Python, especially in designing and implementing scalable data transformations.
  • Solid understanding of Object-Oriented Programming (OOP) principles and design patterns.
  • Proficient in SQL, with the ability to write complex queries and optimize performance.
  • Strong problem-solving skills and the ability to troubleshoot complex data issues independently.
  • Excellent communication and collaboration skills.
  • Hands-on experience with AI tools.


Preferred Qualifications (Nice to Have):

  • Experience working with AWS cloud ecosystem (S3, Glue, EMR, Redshift, Lambda, etc.).
  • Exposure to data warehousing concepts, distributed computing, and performance tuning.
  • Familiarity with version control systems (e.g., Git), CI/CD pipelines, and Agile methodologies.
  • Exposure to AI tools and hands-on experience building AI applications.
Not Specified
Contract ODI Developer - Hybrid Onsite in Boston MA - USC OR GC ONLY
✦ New
Salary not disclosed
Boston, MA, Hybrid 4 hours ago
Please send current resumes directly to Bhagyashree Yewle, Principal Lead Recruiter - YOH SPG.
ODI Developer - Hybrid Onsite in Boston MA - USC OR GC ONLY (No Visas)
  • Location: Boston, MA
  • Hybrid: 3 days on site
  • Potential Convert: Yes, USC/GC ONLY no exceptions. WILL NOT SPONSOR
Top 5 must-haves:
  • ETL/ELT
  • ODI
  • PL/SQL coding
  • 7 years’ experience
  • Knowledge of the admin side (not day-to-day administration, but able to handle it when needed)
  • Scripting: Python and Unix scripting
Role Overview:
Seeking a highly skilled and experienced Sr. ODI Developer to join our Private Banking Systems team. The ideal candidate will possess expertise in a range of technologies, including ODI (Oracle Data Integrator), Oracle Data Warehouse, Linux, and Python scripting; a deep understanding of the banking domain is a big plus. As a Data Engineer, you will play a pivotal role in designing, developing, and maintaining data solutions.

Key Responsibilities:
  • Build ODI mappings/interfaces, packages, procedures, scenarios, topology configuration, ODI Agent and load plans to integrate data from multiple enterprise systems.
  • Build PL/SQL queries, procedures, and data-loading processes, ensuring high performance and scalability to meet the evolving data needs of the various applications.
  • Design, develop, and maintain ETL/ELT pipelines using Oracle Data Integrator (ODI).
  • Collaborate effectively with cross-functional teams, including other data engineers, DBA group, analysts, and business stakeholders, to understand data requirements and deliver solutions.
  • Monitor and troubleshoot RMJ jobs, ODI workflows, sessions, agents, and data pipelines on Linux environments.
  • Perform root cause analysis for failures related to ODI workflows, RMJ jobs, network connectivity, API integrations, and file transfers.
  • Optimize ETL workflows to improve reliability, performance, and scalability.
  • Use scripting and automation tools to support data processing and operational workflows.
  • Work in Linux/Unix environments, using command-line tools and shell scripts for job automation and troubleshooting.
  • Maintain comprehensive documentation of data processes, configurations, and best practices.
  • Participate in walk-throughs which review program specifications, source code, and all technical supporting documentation, including screens/reports. Provide feedback in accordance with team standards and guidelines.
  • Participate in implementation of changes, enhancements, and newly developed programs.
  • Conduct technical research and provide recommendations, develop proofs of concept or prototypes, contributing to technical design of applications.
  • Help identify coding patterns and anti-patterns and enforce implementation of the patterns through code reviews.
  • Quickly resolve issues encountered by business lines in the production environment, maintaining a helpful, "high touch" approach to working with business users and performing root cause analysis, technology evaluation, and performance tuning.
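
A minimal sketch of the Python-side automation described above, calling a hypothetical PL/SQL load procedure through the python-oracledb driver (credentials, DSN, and the procedure name are placeholders):

    # Invoke a (hypothetical) PL/SQL stored procedure that loads a staging table.
    import oracledb

    conn = oracledb.connect(user="odi_etl", password="***",       # placeholder credentials
                            dsn="dbhost:1521/ORCLPDB1")           # placeholder DSN
    cur = conn.cursor()
    cur.callproc("etl_pkg.load_staging", ["2024-01-31"])          # hypothetical procedure
    conn.commit()
    cur.close()
    conn.close()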

Desired Qualifications:

  • Degree in Computer Science, Engineering or related technical area
  • 7+ years of extensive hands-on experience in ODI, Oracle Data Warehouse, Oracle PL/SQL, Linux, Python scripting, and the ODI admin module (ODI Agent setup, log configuration, certificate installation).
  • Must have experience building PL/SQL queries for Oracle Server (including stored procedures, functions, etc.) and must understand basic principles of data modeling.
  • Excellent collaborative and communication skills, particularly in high-stress situations
  • Experience with Python and Linux scripting, CLE, and networking fundamentals (APIs, IP/ports, SFTP/FTP connectivity)
  • High proficiency in development practices: unit testing, Continuous Integration (CI/CD), refactoring, clean code
  • Experience with Bitbucket/GIT source control management
  • Problem-solving skills; able to identify upcoming risks and issues and address them accordingly.
  • Ability to interpret and troubleshoot applications using logs.
  • Pro-active approach and good communication skills.
  • Experience with agile methodologies (Scrum, Kanban) and tools (Jira)
Nice to Have:
  • Private Banking domain experience.
  • Working experience in a financial service industry
  • Knowledge of financial applications such as FIS AddVantage, CRD, and Pivotal CRM.
  • Experience with Apache Airflow for workflow orchestration.
  • Knowledge of dbt (Data Build Tool) for modern data transformations.
  • Exposure to cloud data platforms or hybrid data architectures.

Key Competencies:

  • Strong analytical and problem-solving skills
  • Ability to work with large-scale enterprise data environments
  • Excellent collaboration and communication skills
  • Ability to manage multiple priorities in a fast-paced environment
  • Commitment to continuous learning and technology innovation

Estimated Min Rate: $55.00

Estimated Max Rate: $72.00

What’s In It for You?
We welcome you to be a part of one of the largest and most legendary global staffing companies and to meet your career aspirations. Yoh’s network of client companies has been employing professionals like you for over 65 years in the U.S., UK, and Canada. Join Yoh’s extensive talent community to gain access to Yoh’s vast network of opportunities, including this exclusive opportunity. Benefit eligibility is in accordance with applicable laws and client requirements. Benefits include:

  • Medical, Prescription, Dental & Vision Benefits (for employees working 20+ hours per week)
  • Health Savings Account (HSA) (for employees working 20+ hours per week)
  • Life & Disability Insurance (for employees working 20+ hours per week)
  • MetLife Voluntary Benefits
  • Employee Assistance Program (EAP)
  • 401K Retirement Savings Plan
  • Direct Deposit & weekly epayroll
  • Referral Bonus Programs
  • Certification and training opportunities

Note: Any pay ranges displayed are estimations. Actual pay is determined by an applicant's experience, technical expertise, and other qualifications as listed in the job description. All qualified applicants are welcome to apply.

Yoh, a Day & Zimmermann company, is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Contact us if you are an individual with a disability and require accommodation in the application process.

For California applicants, qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. All of the material job duties described in this posting are job duties for which a criminal history may have a direct, adverse, and negative relationship potentially resulting in the withdrawal of a conditional offer of employment.

It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability.

By applying and submitting your resume, you authorize Yoh to review and reformat your resume to meet Yoh’s hiring clients’ preferences. To learn more about Yoh’s privacy practices, please see our Candidate Privacy Notice. Remote working/work-at-home options are available for this role.

contract
Procurement Consultant
✦ New
Salary not disclosed
Sunnyvale, CA 4 hours ago

We’re Hiring: Ivalua Techno-Functional Lead Consultant

Location: Sunnyvale, CA (Onsite – 3 days/week)


We are looking for an experienced Ivalua Techno-Functional Lead Consultant with strong expertise in procurement systems, SQL, and integrations. The ideal candidate will have deep hands-on experience working with the Ivalua platform, supporting procurement processes, and collaborating with cross-functional teams to deliver scalable solutions.

This role requires a strong understanding of sourcing and procurement workflows, technical integration capabilities, and the ability to work closely with stakeholders and third-party solution providers.


Key Responsibilities

  • Lead and support Ivalua implementation, configuration, and enhancement projects.
  • Write and optimize SQL queries to generate reports, retrieve data, and support integrations.
  • Build and support API integrations and EAI connections with other enterprise applications.
  • Configure Ivalua modules, including workflows, callbacks, and solution design.
  • Provide production support, including troubleshooting priority bugs and fixes.
  • Work with stakeholders to improve sourcing and procurement processes.
  • Participate in project release processes and system improvements.
  • Maintain strong collaboration with clients and third-party solution providers.


Required Skills & Experience

  • 8+ years of hands-on experience with Ivalua in a project role within industry or consulting.
  • L2 (preferred) or L3 certification in any of the following areas from Ivalua: P2P, INT, or SQL.
  • Strong experience working with SQL queries to build reports, retrieve data, and support integrations.
  • Deep understanding of Ivalua solutions and API development.
  • Strong knowledge of sourcing and procurement processes and their business value drivers.
  • Experience with configuration modules including design, workflows, and callback creation.
  • Experience supporting production issues, priority bugs, and fixes.
  • Knowledge of project release processes.
  • Excellent analytical, problem-solving, and communication skills.
  • Experience building strong client relationships and collaborating with third-party solution providers.
Not Specified
Software Engineer – Java Technologies
✦ New
Salary not disclosed
Rensselaer, NY 4 hours ago

Hiring: Java Software Engineer

Location: Rensselaer Area, NY (Hybrid – 2–3 days onsite)

Employment Type: Contract

Status: Actively Hiring

We are looking for a Java Application Developer to join a dynamic team working on large-scale enterprise systems. This role focuses on backend development, API integration, and database-driven applications.

Key Responsibilities:

• Develop and enhance applications using Core Java, J2EE, and Spring Framework

• Design and optimize SQL queries for performance and reporting needs

• Build and maintain RESTful Web Services / APIs

• Perform unit testing using JUnit and follow standard build processes

• Troubleshoot and resolve issues in enterprise-level applications

• Collaborate with cross-functional teams to deliver scalable solutions

Required Qualifications:

• 2–4 years of hands-on experience with Java technologies (Core Java, J2EE, Spring)

• Strong experience with SQL queries (basic to complex)

• Exposure to large-scale or enterprise application environments

• Familiarity with build tools and unit testing (JUnit)

• Bachelor’s degree in Computer Science, MIS, or related field (or equivalent experience)

• Good problem-solving and communication skills

Not Specified
Software Test Engineer - Hybrid
✦ New
Salary not disclosed

Crown Equipment Corporation is a leading innovator in world-class forklift and material handling equipment and technology. As one of the world’s largest lift truck manufacturers, we are committed to providing the customer with the safest, most efficient and ergonomic lift truck possible to lower their total cost of ownership.

Primary Responsibilities

  • Design, develop, and maintain automated test scripts using Selenium and IntelliJ IDEA.
  • Create reusable and clean code that supports robust testing frameworks. Integrate automated tests into CI/CD pipelines to enable continuous testing.
  • Conduct comprehensive API testing ensuring thorough end-to-end integration across various systems.
  • Perform database testing and write SQL queries across multiple database management platforms.
  • Analyze requirement specifications focused on determining the viability and feasibility of automation for eligible features.
  • Debug and resolve automation failures.
  • Maintain the automation repository.
  • Manage the execution of automation regression suites.
  • Perform functional and non-functional testing of software products and solutions developed.
  • Perform regression testing of module firmware as needed.
  • Write, revise, and verify quality standards and test procedures for program design and product evaluation to ensure software quality.
  • Develop processes and procedures to test product requirements, use cases, and wireframes in the form of test cases and other documentation.
  • Perform requirement analysis and test estimation of software under test.
  • Design test cases according to the quality standards and procedures.
  • Define test data and test environment requirements to execute defined tests.
  • Perform defect reporting, management and closure as per department standards and procedures.
  • Participate proactively in QA initiatives for continuous improvements according to department objectives.
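
A minimal sketch of an automated UI check of the kind described above, written with Selenium’s Python bindings for brevity (the URL and element IDs are hypothetical):

    # Drive a browser through a login flow and assert the landing page title.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.com/login")                   # placeholder URL
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("not-a-real-password")
        driver.find_element(By.ID, "login-button").click()
        assert "Dashboard" in driver.title                        # simple post-login check
    finally:
        driver.quit()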

Minimum Qualifications

  • Bachelor’s degree (Computer Science, Information Systems) and at least 2 years of related experience. Non-degreed candidates considered with 12+ years of related experience and a high school diploma or GED.
  • Able to automate test scripts using Selenium and IntelliJ IDEA.
  • Proficient in at least one mainstream programming language such as Java, Python, C#, JavaScript/TypeScript.
  • Experience with code versioning and CI/CD tools like GitHub, Jenkins, Bamboo or similar tools to integrate and run the automated tests in pipelines and enable continuous testing.
  • Experience with API testing tools like Postman, SoapUI, and REST Assured.
  • Experience in writing SQL queries and database testing using MySQL, SQL Server, Oracle, or PostgreSQL.
  • Experience in quality assurance methodologies or software testing
  • Good written, verbal, analytical and interpersonal skills.

Work Authorization:

Crown will only employ those who are legally authorized to work in the United States. This is not a position for which sponsorship will be provided. Individuals with temporary visas or who need sponsorship for work authorization now or in the future, are not eligible for hire.


No agency calls please.

Compensation and Benefits:

Crown offers an excellent wage and benefits package for full-time employees including Health/Dental/Vision/Prescription Drug Plan, Flexible Benefits Plan, 401K Retirement Savings Plan, Life and Disability Benefits, Paid Parental Leave, Paid Holidays, Paid Vacation, Tuition Reimbursement, and much more.

EOE Veterans/Disabilities


Remote working/work at home options are available for this role.
Not Specified
Snowflake Admin
✦ New
Salary not disclosed
San Jose, CA 4 hours ago

Job Description

  • Work on day-to-day Snowflake admin activities such as storage integration (AWS and GCP buckets) and integration setup (notification, API, and email).
  • Set up inbound and outbound data shares per project requirements.
  • Manage IP allow-listing per project requirements, after review/approval by the Platform Manager.
  • Guide new application onboarding and provide in-depth knowledge of costing and billing aspects per the standards.
  • Analyze Snowflake warehouse usage and recommend improvements based on usage trends and cost optimization.
  • Provide guidance on using Cloud Toolkit APIs, reading secrets, and managing passwords in Keeper Vault Services.
  • Configure and optimize Snowflake accounts, virtual warehouses, databases, schemas, and user roles.
  • Monitor platform performance and ensure optimal configuration and scaling of compute resources.
  • Implement and manage data security measures, including role-based access control, data encryption, and user authentication.
  • Monitor query performance and optimize SQL queries for better efficiency; analyze and resolve performance bottlenecks and resource contention issues; use Snowflake’s advanced features such as clustering keys, materialized views, and result caching to enhance performance.
  • Implement backup and recovery strategies to ensure data durability and availability; manage data archiving, replication, and failover processes.
  • Develop and maintain scripts for automating administrative tasks and workflows; use tools and frameworks for continuous integration and deployment (CI/CD).
  • Provide technical support and troubleshooting for Snowflake-related issues; collaborate with Snowflake support to resolve platform-specific issues.
  • Knowledge of an RDBMS such as Teradata or Oracle.
  • Knowledge of programming languages such as Python or JavaScript.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Experience with a 24x7 support model.
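
A minimal sketch of routine administration of this kind via the snowflake-connector-python driver (the account, warehouse, and role names are hypothetical):

    # Create a cost-controlled virtual warehouse and grant a role usage on it.
    import snowflake.connector

    conn = snowflake.connector.connect(account="myorg-myaccount",  # placeholders
                                       user="admin_user",
                                       password="***")
    cur = conn.cursor()
    cur.execute("CREATE WAREHOUSE IF NOT EXISTS analytics_wh "
                "WITH WAREHOUSE_SIZE='XSMALL' AUTO_SUSPEND=60 AUTO_RESUME=TRUE")
    cur.execute("CREATE ROLE IF NOT EXISTS analyst_role")
    cur.execute("GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst_role")
    cur.close()
    conn.close()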

Salary Range: $64,000 - $125,000 a year

Not Specified
SENIOR AWS DATA ENGINEER
✦ New
Salary not disclosed
Irving, TX 4 hours ago

Visa Status: US Citizen or Green Card Only

Location: Irving, TX (Local Candidates Only)

Employment Type: Full-time / Direct Hire

Work Environment: Hybrid (Monday thru Thursday - in office / Friday - at home)


***MUST HAVE 10+ YEARS EXPERIENCE AS A DATA ENGINEER***


***US Citizen or Green Card Only***


The AWS Senior Data Engineer will own the planning, design, and implementation of data structures for this leading Hospitality Corporation in their AWS environment. This role will be responsible for incorporating all internal and external data sources into a robust, scalable, and comprehensive data model within AWS to support business intelligence and analytics needs throughout the company.


Responsibilities:

  • Collaborate with cross-functional teams to understand and define business intelligence needs and translate them into data modeling solutions
  • Develop, build, and maintain scalable data pipelines, data schema designs, and dimensional data models in Databricks and AWS for all system data sources, API integrations, and bespoke data ingestion files from external sources, including batch and real-time pipelines.
  • Responsible for data cleansing, standardization, and quality control
  • Create data models that will support comprehensive data insights, business intelligence tools, and other data science initiatives
  • Create data models and ETL procedures with traceability, data lineage and source control
  • Design and implement data integration and data quality framework
  • Implement data monitoring best practices with trigger based alerts for data processing KPIs and anomalies
  • Investigate and remediate data problems, performing and documenting thorough and complete root cause analyses. Make recommendations for mitigation and prevention of future issues.
  • Work with Business and IT to assess efficacy of all legacy data sources, making recommendations for migration, anonymization, archival and/or destruction.
  • Continually seek to optimize performance through database indexing, query optimization, stored procedures, etc.
  • Ensure compliance with data governance and data security requirements, including data life cycle management, purge and traceability.
  • Create and manage documentation and change control mechanisms for all technical design, implementations and systems maintenance.

Target Skills and Experience

  • Bachelor's or graduate degree in computer science, information systems or related field preferred, or similar combination of education and experience
  • At least 10 years’ experience designing and managing data pipelines, schema modeling, and data processing systems.
  • Experience with Databricks a plus (or similar tools like Microsoft Fabric, Snowflake, etc.) to drive scalable data solutions.
  • Experience with SAP a plus
  • Proficient in Python, with a track record of solving real-world data challenges.
  • Advanced SQL skills, including experience with database design, query optimization, and stored procedures.
  • Experience with Terraform or other infrastructure-as-code tools is a plus.
Not Specified
Sales Analyst
✦ New
Salary not disclosed
Bentonville, AR 4 hours ago

Summary

The Sales Analyst will serve as a point of contact on customer and internal portals for queries about products, item set-up, item management, and pricing management, and will provide support for sales while acting as a liaison, working cross-functionally with operations, customer service, and marketing services.


Duties and Responsibilities

  • Manage all order discrepancies daily and resolve them with sales and the customer.
  • Assist the Sales group in the execution of customer-based portals.
  • Check data accuracy and maintain all portals, including 1WorldSync.
  • Contact clients to obtain missing information or answer queries.
  • Liaise with the marketing services department to provide timely assistance for sales requests.
  • Maintain and update sales and customer records.
  • Stay up to date with new products and features.
  • Resolve administrative problems by analyzing information and identifying and communicating solutions.
  • Direct administrative productivity in accordance with management directives.
  • Accomplish department and organization missions by completing related tasks and projects as needed.
  • Plan, organize, and coordinate relevant information across departments.
  • Respond to concerns or questions from the sales team and assist with resolutions.
  • Maintain the customer planning calendar.
  • Support multiple Sales Directors.
  • Assist with maintaining and updating sales reporting.


Qualifications & Experience

  • Proven work experience as a Sales Administrator or Sales Analyst
  • Proficiency in the MS Office Suite with advanced Excel and PowerPoint skills required.
  • Hands-on experience with data input systems (e.g. CRM, DOMO software) preferred.
  • Understanding of sales performance metrics.
  • Excellent organizational and multitasking skills.
  • Ability to work under strict deadlines.
  • Process management and improvement
  • Ability to establish strong internal and external relationships.
  • Effective written and verbal communication skills.
  • Ability to manage multiple priorities at once.
  • Technical aptitude to operate in several different systems simultaneously.
  • Proven ability to balance high level of details accurately with speed of resolution and quality of service.


Competencies/ Skills

  • Innovation Mindset
  • Team Player
  • Problem Solving/Analytical
  • Attention to detail and strong organizational skills
  • Project/Time Management (manage priorities and workflow)
Not Specified
Data Analyst Manager
✦ New
Salary not disclosed
Hickory, NC 4 hours ago

Who We Are

At Feetures, movement is our business. And we believe that a meaningful business begins with authentic values—and our values were forged by the bonds of family.

What started as a bold idea around a kitchen table has grown into a fast-moving, purpose-driven brand redefining performance. As a family-owned company in North Carolina, we’re fueled by the belief that better is always possible—and that energy drives both our products and our culture.

Movement is at the heart of everything we do. From our socks to our team and to our communities, we are always pushing forward. If you are ready to grow, challenge the status quo, and help shape the next chapter of a brand that is always in stride, come move with us. Feetures is Meant to Move. Are you?


Role Summary:

The Data Analytics Manager is responsible for owning and optimizing the organization’s end-to-end data ecosystem, ensuring that data infrastructure, governance, and analytics processes effectively support business operations. This role leads the design and management of the data stack—from source system integrations and NetSuite Analytics Warehouse to reporting and business intelligence tools—while establishing strong data governance standards, quality monitoring, and documentation practices. The manager also oversees and mentors analytics team members, prioritizes analytics requests, and coordinates cross-functional data workflows. Acting as the central authority for data reliability and insights, the role ensures consistent metric definitions, scalable data models, and accurate reporting while translating complex data into clear, actionable insights for business stakeholders.


Responsibilities:

Data Architecture & Tooling

  • Own the end-to-end data stack — from source system integrations and the NetSuite Analytics Warehouse to downstream reporting layers
  • Evaluate, select, and implement tools that improve data accessibility, reliability, and performance
  • Ensure alignment between data infrastructure and evolving business needs across distribution operations
  • Design and maintain scalable data models, SuiteQL queries, and saved searches within NetSuite

Data Governance & Quality

  • Define and enforce data standards, metric definitions, and naming conventions across all business domains
  • Establish data ownership, lineage documentation, and access governance policies
  • Implement monitoring and alerting for data quality issues across source systems and the warehouse
  • Build and maintain a data dictionary that serves as the single source of truth for the organization

Orchestration of Analysts & Systems

  • Manage and mentor the Data Analyst and Business Analyst — prioritizing requests, unblocking work, and validating outputs
  • Triage and prioritize the analytics request queue in alignment with business stakeholders and IT leadership
  • Coordinate cross-functional data workflows and ensure handoffs between systems and analysts are clean and documented
  • Serve as the escalation point for data discrepancies, report failures, and analytical questions from the business


Qualifications:

Required

  • 3-5 years of experience in data analytics, business intelligence, or data engineering
  • 2+ years in a lead or management role overseeing analysts or data team members
  • Strong proficiency in SQL; experience with SuiteQL or similar ERP query languages
  • Hands-on experience with NetSuite, including Analytics Warehouse, saved searches, and reporting
  • Proven track record establishing data governance standards and documentation practices
  • Experience integrating and managing multiple data sources across SaaS and ERP platforms
  • Demonstrated ability to translate complex data into clear, actionable insights for non-technical stakeholders

Preferred

  • Experience in distribution, wholesale, or supply chain environments
  • Familiarity with SaaS BI platforms (e.g., Tableau, Power BI, Looker, or embedded analytics)
  • Exposure to scripting or automation (JavaScript, Python, or similar) for data workflows
  • Background working within IT-led or hybrid IT/Analytics teams


Benefits:

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Life & Disability insurance
  • 401(K) with company match


Company Paid holidays and PTO:

  • Feetures offers 20 PTO Days which are available to you on day one of employment and are available to all employees, no matter your role. After working at Feetures for 5 years, your PTO days will increase to 25 days. Days can be used for vacations, appointments and sick days.
  • We offer 10 company paid holidays and 1 floating holiday per year.


Perks:

  • Parking provided (Charlotte office and onsite at Hickory office)
  • Employee Engagement team
  • Monthly stipend to pursue an active lifestyle


Feetures is an Equal Opportunity Employer that welcomes and encourages all applicants to apply regardless of age, race, sex, religion, color, national origin, disability, veteran status, sexual orientation, gender identity and/or expression, marital or parental status, ancestry, citizenship status, pregnancy or other reasons protected by law.

Not Specified
Teradata Infrastructure DBA
✦ New
Salary not disclosed
Plano, TX 4 hours ago

Must have

Teradata platform expertise

• Deep knowledge of Teradata architecture: parsing engine, BYNET, AMPs, vprocs, fallback, hashing, PDCR, and spool management.

• Data distribution and primary index design; collecting statistics and understanding optimizer behavior.

• Experience with recent Teradata versions and with migration/upgrade planning: TD 16.xx, TD 17.xx, and preferably TD 20.xx.

System administration

• Provisioning and managing Teradata nodes and clusters (physical and virtual).

• OS-level skills: Linux administration (SLES/RHEL/CentOS/Oracle Linux) for Teradata on Linux, including kernel tuning, package management, user and permissions management.

• Storage subsystem knowledge: SAN, NAS, Fibre Channel, LUNs, RAID, and how storage impacts Teradata I/O and spool.

Performance tuning and troubleshooting

• SQL query and plan analysis; collecting and interpreting Explain plans.

• Workload management (WLM) and resource allocation: query prioritization, throttling, and KRI/SLAs.

• Monitoring and diagnostics: using Teradata tools and logs to analyze spool, CPU, memory, disk I/O, network, BYNET contention.
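
A minimal sketch of statistics collection and plan analysis as described above, using the teradatasql driver (host, credentials, and table names are placeholders):

    # Refresh optimizer statistics, then print the EXPLAIN text for a query.
    import teradatasql

    with teradatasql.connect(host="td-prod", user="dba_user",      # placeholders
                             password="***") as con:
        cur = con.cursor()
        cur.execute("COLLECT STATISTICS ON sales.orders COLUMN (customer_id)")
        cur.execute("EXPLAIN SELECT customer_id, SUM(amount) "
                    "FROM sales.orders GROUP BY customer_id")
        for (line,) in cur.fetchall():                             # one text line per row
            print(line)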

Backup, recovery & high availability

• Best practices for backup and restore procedures, and for disaster recovery (DR) planning and testing.

• Knowledge of fallback, AMP resilience, replication methods and physical vs logical protection.

Security & compliance

• DB and platform-level security: roles, privileges, LDAP/Kerberos integration, encryption (at rest/in transit), auditing, and compliance (SOX and others as applicable).

• Secure configuration and hardening practices.

Networking & infrastructure

• Network architecture for Teradata clusters, VLANs, link aggregation, low-latency requirements, and BYNET tuning.

• Integration with enterprise infrastructure: DNS, NTP, monitoring stacks, and identity providers.

Automation, scripting & tools

• Scripting languages (at least one of): Bash, Python, or Perl, for automation, maintenance, and custom monitoring.

• Configuration management and automation tools (at least one of): Ansible, Terraform, Chef, or Puppet (as used in the enterprise).

• Familiarity with Teradata utilities and tools (at least one of): BTEQ, FastLoad, MultiLoad, TPT (Teradata Parallel Transporter), DBSControl, Viewpoint, Teradata Studio/SQL Assistant.

Observability & tooling

• Use of monitoring/alerting tools (Viewpoint, Prometheus, Grafana, Splunk, Nagios, etc.) and design of dashboards and alerts; Viewpoint is mandatory.

• Capacity planning, trending, and forecasting for CPU, disk, spool, and concurrency.

Soft skills & organizational capabilities

• Incident management and on-call experience

• Leading postmortems, RCA (root-cause analysis), implementing corrective actions.

• Communication and stakeholder management: vendors, management, and application teams.

• Translate technical impacts to business stakeholders; coordinate with DBAs, developers, network/storage teams, and vendors.


Role and Responsibilities

• Install, configure, and upgrade Teradata software and related products.

• Back up, restore, and migrate Teradata data and objects.

• Establish and maintain backup and recovery policies and procedures.

• Manage and monitor system performance; proactively monitor the database systems to ensure secure service with minimum downtime.

• Implement and maintain database security.

• Set up and maintain documentation and standards.

• Support multiple Teradata systems, including independent marts and the enterprise warehouse.

• Work with the team to ensure that appropriate hardware resources are allocated to the databases, and ensure high availability and optimum performance.

• Improve and maintain the databases, including rollouts and upgrades.

• Implement and release database changes submitted by the development team, working with the end customer.

• Coordinate with Teradata, customers, the datacenter, and vendors.

• Data forecasting and security audits.

• User account and access management.

• Teradata Active System Management, customer requests, and system allocation.

• Backup and recovery.

• SOX compliance and audits.

• DB support from third-party vendors.

• Product evaluations.

• On-call support and major incidents.

• Backup/restore frequency and retention.

• Disaster recovery.

• Create long r

Not Specified