Array Definition In Data Structure Jobs in USA

55,158 positions found — Page 8

Databricks - Lead Data Engineer
✦ New
Salary not disclosed
Atlanta, GA 4 hours ago

We Are Hiring: Databricks Lead Data Engineer – Director Equivalent Role

Location: Atlanta, USA

Work Model: Hybrid – 3 to 4 days in office per week (mandatory)

Eligibility: US Citizens and Green Card (GC) holders only

How to Apply

If you are interested in this position and have the required skills, please send your resume to:



Paves Technologies is seeking a highly experienced Databricks Lead Data Engineer – Lead Level (Director Equivalent Role) to drive enterprise-scale data architecture, governance, and advanced analytics initiatives on Azure Cloud. This is a senior leadership role requiring deep Databricks expertise, strong data modeling capabilities, and hands-on architectural ownership across PySpark-based distributed systems.

Role Overview

The ideal candidate will bring 10–12+ years of overall data engineering experience, including strong hands-on expertise with Azure Databricks, PySpark, Python, and Azure Cloud data services. You will define architecture standards, lead modernization initiatives, and implement scalable Medallion Architecture (Bronze, Silver, Gold layers) to support enterprise analytics and business intelligence.

Key Responsibilities

  • Lead end-to-end architecture and implementation of enterprise-scale data platforms using Azure Databricks on Azure Cloud.
  • Design and implement Medallion Architecture (Bronze, Silver, Gold layers) using Delta Lake best practices.
  • Build scalable PySpark-based ETL/ELT pipelines across ingestion (Bronze), transformation (Silver), and curated analytics (Gold) layers.
  • Develop advanced data transformations using Python, PySpark, Spark SQL, and advanced SQL constructs.
  • Architect robust data models (dimensional, star schema, normalized models) aligned to analytics and reporting needs.
  • Drive adoption of advanced Databricks capabilities including Unity Catalog, Declarative Pipelines, Delta Lake optimization, and governance frameworks.
  • Establish best practices for partitioning strategies, file compaction, Z-ordering, caching, broadcast joins, and query optimization.
  • Define and standardize reusable Azure Cloud data platform tools, templates, CI/CD frameworks, and infrastructure automation.
  • Work across Azure ecosystem components such as Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure DevOps, networking, and security services.
  • Ensure high standards for data quality, RBAC, lineage tracking, governance, and production stability.
  • Provide architectural leadership and mentorship to data engineering teams.
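For readers unfamiliar with the Medallion pattern named above, here is a minimal, library-free sketch of the Bronze/Silver/Gold flow (plain Python standing in for PySpark and Delta Lake; the record fields and layer logic are invented for illustration only):

```python
# Toy illustration of Medallion Architecture layers (Bronze -> Silver -> Gold).
# Pure Python stands in for PySpark/Delta Lake; field names are made up.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, tagging each with its layer."""
    return [dict(row, _layer="bronze") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: validate and conform: drop rows missing a key, normalize types."""
    cleaned = []
    for row in bronze_rows:
        if row.get("order_id") is None:
            continue  # a real pipeline would quarantine the bad record
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": float(row.get("amount", 0.0)),
            "region": str(row.get("region", "unknown")).lower(),
        })
    return cleaned

def gold_aggregate(silver_rows):
    """Gold: curated, analytics-ready aggregate (revenue per region)."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = [
    {"order_id": "1", "amount": "120.5", "region": "East"},
    {"order_id": None, "amount": "99.0", "region": "West"},   # bad record
    {"order_id": "2", "amount": "80.0", "region": "east"},
]
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
print(gold)  # {'east': 200.5}
```

In a real Databricks pipeline each layer would be a Delta table rather than a Python list, but the contract is the same: raw landing, cleansing and conformance, then curated aggregates.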


Required Experience & Skills

  • 10–12+ years of overall experience in Data Engineering.
  • 3+ years of strong hands-on Databricks experience.
  • Mandatory certifications: Databricks Certified Data Engineer Associate and Databricks Certified Data Engineer Professional.
  • Deep hands-on expertise in PySpark, Python programming, and distributed Spark processing.
  • Strong experience designing and implementing Medallion Architecture (Bronze/Silver/Gold layers).
  • Advanced knowledge of Data Modeling, Data Analysis, and complex SQL (window functions, CTEs, execution plan tuning).
  • Strong understanding of Delta Lake architecture, schema evolution, partition strategies, performance optimization, and data governance.
  • Well-versed in enterprise Azure Cloud data platforms, reusable accelerators, CI/CD templates, and governance standards.
  • Proven experience architecting scalable, secure, cloud-native data solutions.
  • Strong leadership, stakeholder management, and executive communication skills.
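As a toy illustration of the window-function logic called out in the skills list (ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...) emulated in plain Python; the rows and column names are invented, not from the posting):

```python
# Emulating SQL's ROW_NUMBER() OVER (PARTITION BY customer ORDER BY ts DESC)
# in plain Python: keep only the latest event per customer.
from itertools import groupby
from operator import itemgetter

events = [
    {"customer": "a", "ts": 2, "status": "active"},
    {"customer": "a", "ts": 5, "status": "churned"},
    {"customer": "b", "ts": 1, "status": "active"},
]

def latest_per_customer(rows):
    # Sort by partition key, then descending timestamp within the partition.
    rows = sorted(rows, key=lambda r: (r["customer"], -r["ts"]))
    result = []
    for _, group in groupby(rows, key=itemgetter("customer")):
        result.append(next(group))  # row_number() == 1 within each partition
    return result

print(latest_per_customer(events))
```

In SQL this is the classic "latest row per key" pattern; interviewers for roles like this one often ask for it as a CTE plus ROW_NUMBER filter.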


Manufacturing Data & Sales Analyst
✦ New
🏢 LHH
Salary not disclosed
Addison, IL 1 day ago

LHH Recruitment Solutions has partnered with a growing organization that is seeking a motivated Manufacturing Data & Sales Analyst to join their team. The ideal candidate is a data-driven analytics professional who thrives at the intersection of manufacturing operations, business intelligence, and executive decision support. This is a high-impact role for someone who enjoys building insight from the ground up: designing dashboards, automating reporting, owning data integrity, and translating complex information into clear, actionable business outcomes.


Why This Role Stands Out:

  • High visibility and direct partnership with senior leadership.
  • Opportunity to own and evolve enterprise-level analytics and reporting.
  • Manufacturing environment where data truly drives strategy.
  • Long-term growth potential in a stable, well-capitalized organization.


Key Responsibilities:

Data, Analytics & Reporting:

  • Design, build, and continuously enhance dashboards, scorecards, and KPI reporting to support operational and commercial performance.
  • Translate raw data into meaningful insights that influence decision-making at the executive level.
  • Automate recurring reports and analytics processes to improve efficiency, accuracy, and scalability.
  • Analyze trends related to revenue, production performance, forecasting, and product initiatives.

Manufacturing & Cross-Functional Partnership:

  • Collaborate closely with Operations, Finance, IT, and Commercial teams to align data, metrics, and performance goals.
  • Support forecasting, planning cycles, and performance reviews with reliable, actionable analytics.
  • Identify risks, opportunities, and performance gaps within data sets and recommend solutions.

Systems & Data Ownership:

  • Act as the primary owner of manufacturing and sales-related data systems, ensuring usability, accuracy, and value.
  • Lead continuous improvement of reporting tools and system integrations.
  • Partner with internal and external stakeholders to enhance system reporting capabilities.
  • Champion data governance, consistency, and best practices across the organization.


Qualifications and Skills:

  • Bachelor’s Degree in Data Science, Analytics, Business Intelligence, or a related field
  • Proven experience building and maintaining dashboards, scorecards, and analytics tools.
  • Background supporting a manufacturing environment.
  • Strong ability to own data end-to-end—from extraction to interpretation to executive presentation.
  • Experience automating reporting and analytics processes.
  • Advanced analytical, problem-solving, and critical-thinking skills.
  • Ability to clearly communicate insights to both technical and non-technical audiences.
  • Advanced proficiency with Excel, reporting platforms, and Microsoft Office Suite.
  • Advanced proficiency in SQL, PowerBI, and/or Tableau.
  • Experience with IQMS is preferred.
  • Strategic mindset with exceptional attention to detail.


Compensation Range: $90,000 - $120,000 + 15% Bonus


Benefits Offered: 2 weeks of vacation, paid sick leave where applicable by state law, Medical Insurance, Dental Insurance, Vision Insurance, 401K, and Life Insurance.


If you are a passionate Manufacturing Data & Sales Analyst looking for a new and rewarding career, please apply today! You don’t want to miss out on this opportunity!


LHH is a leader in permanent recruitment and in the placement of top talent. Our areas of specialty include office administration, customer service, human resources, engineering, and supply chain and logistics. Please feel free to check us out and apply for other opportunities if this role isn’t a perfect match.


Equal Opportunity Employer/Veterans/Disabled


To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit

Data Quality Control Specialist - 249388
✦ New
Salary not disclosed
Las Vegas, NV 1 day ago

Seeking a Data Quality Control Specialist in Las Vegas, NV


Pay: $28–$35/hr

Schedule: Full time, onsite, 40 hrs a week


Las Vegas, NV | On-site

Seeking a detail-driven Data Quality Control Specialist to support the accuracy, integrity, and compliance of clinical trial documentation across multiple studies. This role is ideal for an experienced clinical research professional who thrives in data review, quality oversight, and audit readiness.

What You’ll Do:

  • Coordinate and oversee clinical data across various phases of clinical trials, ensuring accuracy and completeness
  • Perform quality control (QC) reviews of source documents, medical records, eSource, and essential trial documentation
  • Identify and communicate data discrepancies, protocol deviations, and documentation issues to PIs and Study Coordinators
  • Collaborate with clinical teams to ensure adherence to SOPs, Good Documentation Practices (GDP), and GCP guidelines
  • Support audit and inspection readiness, including internal QC efforts and inspection prep
  • Monitor key data quality KPIs and assist in driving continuous quality improvement initiatives
  • Partner cross-functionally to uphold data integrity, regulatory compliance, and site quality standards


What We’re Looking For:

  • Bachelor’s degree in Clinical Research, Health Sciences, or related field (or equivalent experience)
  • 3+ years of experience in clinical research, data management, QA/QC, or a related role
  • Strong understanding of GCP, GDP, and regulatory requirements
  • Experience reviewing clinical research documentation (source, CRFs/eCRFs, medical records)
  • Familiarity with eSource platforms (CRIO strongly preferred)
  • Detail-oriented, organized, and process-driven with strong communication skills
  • Comfortable collaborating with coordinators, investigators, and cross-functional teams


Nice to Have:

  • Site-level clinical research experience (CRC, Senior CRC, Data or Regulatory focused roles)
  • Audit or inspection preparation experience
  • Passion for data integrity and clinical trial quality
Lead Data Engineer
Salary not disclosed
Atlanta, GA 3 days ago

Job Title – Lead Data Engineer

Please note: this role is not able to offer visa transfer or sponsorship now or in the future.


About the role


As a Lead Data Engineer, you will make an impact by designing, building, and operating scalable, cloud‑native data platforms supporting batch and streaming use cases, with strong focus on governance, performance, and reliability. You will be a valued member of the Data Engineering team and work collaboratively with cross‑functional engineering, cloud, and architecture stakeholders.


In this role, you will:

  • Design, build, and operate scalable cloud‑native data platforms supporting batch and streaming workloads with strong governance, performance, and reliability.
  • Develop and operate data systems on AWS, Azure, and GCP, designing cloud‑native, scalable, and cost‑efficient data solutions.
  • Build modern data architectures including data lakes, data lakehouses, and data hubs, with strong understanding of ingestion patterns, data governance, data modeling, observability, and platform best practices.
  • Develop data ingestion and collection pipelines using Kafka and AWS Glue; work with modern storage formats such as Apache Iceberg and Parquet.
  • Design and develop real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks, with understanding of event‑driven architectures and low‑latency data processing.
  • Perform data transformation and modeling using SQL‑based frameworks and orchestration tools such as dbt, AWS Glue, and Airflow, including Slowly Changing Dimensions (SCD) and schema evolution.
  • Use Apache Spark extensively for large‑scale data transformations across batch and streaming workloads.
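The Slowly Changing Dimension handling mentioned in the responsibilities can be sketched minimally; here is a Type 2 merge in plain Python (library-free; the column names and the sentinel end date are assumptions, not from the posting):

```python
# Minimal Type 2 Slowly Changing Dimension merge in plain Python.
# On an attribute change, the current row is closed out and a new
# versioned row is opened. Column names are illustrative assumptions.
OPEN_END = "9999-12-31"  # sentinel meaning "currently active"

def scd2_apply(dim_rows, change, as_of):
    """Close the active row for change['key'] if its attribute differs,
    then append a new active row effective from `as_of`."""
    key, new_value = change["key"], change["value"]
    for row in dim_rows:
        if row["key"] == key and row["end_date"] == OPEN_END:
            if row["value"] == new_value:
                return dim_rows  # no change: keep history as-is
            row["end_date"] = as_of  # expire the old version
    dim_rows.append(
        {"key": key, "value": new_value, "start_date": as_of, "end_date": OPEN_END}
    )
    return dim_rows

dim = [{"key": "cust-1", "value": "Gold", "start_date": "2024-01-01", "end_date": OPEN_END}]
dim = scd2_apply(dim, {"key": "cust-1", "value": "Platinum"}, as_of="2025-06-01")
print(dim)
```

In dbt or Delta Lake this same logic is typically expressed as a snapshot or MERGE statement, but the row-versioning contract is identical.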


Work model

We believe hybrid work is the way forward as we strive to provide flexibility wherever possible. Based on this role’s business requirements, this is a hybrid position requiring 4 days a week in a client or Cognizant office in Atlanta, GA. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various wellbeing programs.


The working arrangements for this role are accurate as of the date of posting. This may change based on the project you’re engaged in, as well as business and client requirements. Rest assured, we will always be clear about role expectations.


What you need to have to be considered

  • Hands‑on experience developing and operating data systems on AWS, Azure, and GCP.
  • Proven ability to design cloud‑native, scalable, and cost‑efficient data solutions.
  • Experience building data lakes, data lakehouses, and data hubs with strong understanding of ingestion patterns, governance, modeling, observability, and platform best practices.
  • Expertise in data ingestion and collection using Kafka and AWS Glue, with experience in Apache Iceberg and Parquet.
  • Strong experience designing and developing real‑time streaming pipelines using Kafka, Flink, or similar streaming frameworks.
  • Deep expertise in data transformation and modeling using SQL‑based frameworks and orchestration tools including dbt, AWS Glue, and Airflow, with knowledge of SCD and schema evolution.
  • Extensive experience using Apache Spark for large‑scale batch and streaming data transformations.


These will help you stand out

  • Experience with event‑driven architectures and low‑latency data processing.
  • Strong understanding of schema evolution, SCD modeling, and modern data modeling concepts.
  • Experience with Apache Iceberg, Parquet, and modern ingestion/storage patterns.
  • Strong knowledge of observability, governance, and platform best practices.
  • Ability to partner effectively with cloud, architecture, and engineering teams.



Salary and Other Compensation:

Applications will be accepted until March 17, 2025.

The annual salary for this position is between $81,000 - $135,000, depending on experience and other qualifications of the successful candidate.

This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.

Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:

  • Medical/Dental/Vision/Life Insurance
  • Paid holidays plus Paid Time Off
  • 401(k) plan and contributions
  • Long‑term/Short‑term Disability
  • Paid Parental Leave
  • Employee Stock Purchase Plan


Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.

Senior Data Scientist
Salary not disclosed
Vienna, VA 3 days ago

SteerBridge Strategies is a modern technology company delivering innovative, mission‑focused solutions to the U.S. Government and private sector. Leveraging deep expertise in federal acquisition, digital transformation, and emerging technologies, we deliver agile, commercial‑grade capabilities that accelerate operational effectiveness and drive measurable mission success.

At the core of SteerBridge is our people—especially the veterans whose leadership, problem‑solving mindset, and commitment to excellence elevate every project we support. We don’t simply hire exceptional talent; we cultivate it, creating meaningful career pathways for veterans, military spouses, and professionals who share our passion for advancing technology and strengthening the missions we serve.


SteerBridge is looking for a Data Scientist to evaluate multi-dimensional USMC C-130 global supply chain and operational data to construct and maintain predictive models. Candidates must be familiar with multiple types of data models including, but not limited to, generalized linear and multilinear regression, logistic and multinomial regression, and time series analysis. Candidates must have hands-on experience with supervised (classification, regression) and unsupervised learning (clustering, dimension reduction).
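As a minimal illustration of the regression modeling described above, here is ordinary least squares for a single predictor in plain Python (the data points are invented for illustration):

```python
# Ordinary least-squares fit for one predictor, from first principles:
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
# The data points are invented for illustration.

def ols_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]          # exactly y = 2x + 1
slope, intercept = ols_fit(xs, ys)
print(slope, intercept)     # 2.0 1.0
```

In practice a candidate would reach for R's `lm` or Python's statsmodels/scikit-learn, but the closed form above is what those libraries generalize.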


Qualifications

  • Must be a U.S. Citizen.
  • MSc or PhD in applied mathematics, statistics, or a related field, or equivalent relevant work experience.
  • An active security clearance or the ability to obtain one is required.
  • Collaborate with various stakeholders to understand requirements and translate those requirements into data science solutions.
  • Provide guidance on best practices and industry standards across data science and analytics, data visualization, and share expertise to improve technical capabilities of the team.
  • Design, develop, and integrate templates, data, and models for repeatability.
  • Develop and implement data quality assurance and management protocols.
  • Create, maintain, and organize technical documentation for all data collection, cleaning, and analyses.


Required and Preferred Skillsets

  • Must be familiar with multiple types of data models including, but not limited to, generalized linear and multilinear regression, logistic and multinomial regression, and time series analysis.
  • Must have hands-on experience with supervised (classification, regression) and unsupervised learning (clustering, dimension reduction).
  • 7+ years of experience evaluating relationships in data using statistical modeling and leveraging analytics tools.
  • 7+ years of experience in advanced Classification and Regression modeling.
  • 7+ years of professional proficiency using R or Python for data wrangling and model building.
  • Experience in SQL or Spark SQL, and basic database design.
  • Cloud project work using Google, AWS and/or Azure.
  • Demonstrated high proficiency in statistical analysis and data visualization.
  • Demonstrated high proficiency in data wrangling and documentation.
  • Solid technical skills across a wide variety of tools and data platforms.
  • Able to successfully prioritize and manage multiple project tasks simultaneously and complete them in a timely manner with a high degree of accuracy.
  • Strong record of applied data analysis.
  • Excellent writing and presentation skills with a successful track record of communicating complex concepts to diverse audiences.
  • Aviation Background Required!
  • Preferred:
      • (Highly preferred) AWS or Google Cloud Professional or Specialty Certification, or the ability to obtain certification.
      • Top Secret security clearance.
      • Experience with supply chain management data systems and technology (e.g., ERP, Transportation Management, and Warehouse Management systems).
      • Experience supporting DoD and/or VA missions.
      • Proficiency in integrating and interfacing with software development processes.
      • Consulting experience.
      • RAG, embeddings, vector databases, Hugging Face Transformers, BERT, BART, LLMs.


Benefits

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Life Insurance
  • 401(k) Retirement Plan with matching
  • Paid Time Off
  • Paid Federal Holidays
Databricks Architect/ Senior Data Engineer
✦ New
🏢 OZ
Salary not disclosed
Boca Raton, FL 1 day ago

OZ – Databricks Architect/ Senior Data Engineer


Note: Only applications from U.S. citizens or lawful permanent residents (Green Card Holders) will be considered.


We believe work should be innately rewarding and a team-building venture. Working with our teammates and clients should be an enjoyable journey where we can learn, grow as professionals, and achieve amazing results. Our core values revolve around this philosophy. We are relentlessly committed to helping our clients achieve their business goals, leapfrog the competition, and become leaders in their industry. What drives us forward is the culture of creativity combined with a disciplined approach, passion for learning & innovation, and a ‘can-do’ attitude!


What We're Looking For:

We are seeking a highly experienced Databricks professional with deep expertise in data engineering, distributed computing, and cloud-based data platforms. The ideal candidate is both an architect and a hands-on engineer who can design scalable data solutions while actively contributing to development, optimization, and deployment.


This role requires strong technical leadership, a deep understanding of modern data architectures, and the ability to implement best practices in DataOps, performance optimization, and data governance.


Experience with modern AI/GenAI-enabled data platforms and real-time data processing environments is highly desirable.


Position Overview:

The Databricks Senior Data Engineer will play a critical role in designing, implementing, and optimizing enterprise-scale data platforms using the Databricks Lakehouse architecture. This role combines architecture leadership with hands-on engineering, focusing on building scalable, secure, and high-performance data pipelines and platforms. The ideal candidate will establish coding standards, define data architecture frameworks such as the Medallion Architecture, and guide the end-to-end development lifecycle of modern data solutions.


This individual will collaborate with cross-functional stakeholders, including data engineers, BI developers, analysts, and business leaders, to deliver robust data platforms that enable advanced analytics, reporting, and AI-driven decision-making.


Key Responsibilities:

  • Architecture & Design: Architect and design scalable, reliable data platforms and complex ETL/ELT and streaming workflows for the Databricks Lakehouse Platform (Delta Lake, Spark).
  • Hands-On Development: Write, test, and optimize code in Python, PySpark, and SQL for data ingestion, transformation, and processing.
  • DataOps & Automation: Implement CI/CD, monitoring, and automation (e.g., with Azure DevOps, DBX) for data pipelines.
  • Stakeholder Collaboration: Work with BI developers, analysts, and business users to define requirements and deliver data-driven solutions.
  • Performance Optimization: Tune delta tables, Spark jobs, and SQL queries for maximum efficiency and scalability.
  • GenAI Applications Development: experience in GenAI application development is a strong plus.


Requirements:

  • 8+ years of experience in data engineering, with strong hands-on expertise in Databricks and Apache Spark.
  • Proven experience designing and implementing scalable ETL/ELT pipelines in cloud environments.
  • Strong programming skills in Python and SQL; experience with PySpark required.
  • Hands-on experience with Databricks Lakehouse, Delta Lake, and distributed data processing.
  • Experience working with cloud platforms such as Microsoft Azure, AWS, or GCP (Azure preferred).
  • Experience with CI/CD pipelines, Git, and DevOps practices for data engineering.
  • Strong understanding of data architecture, data modeling, and performance optimization.
  • Experience working with cross-functional teams to deliver enterprise data solutions.
  • Ability to tackle complex data challenges while ensuring data quality and reliable delivery.


Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, Engineering, or a related field.
  • Experience designing enterprise-scale data platforms and modern data architectures.
  • Experience with data integration tools such as Azure Data Factory or similar platforms.
  • Familiarity with cloud data warehouses such as Databricks, Snowflake, or Microsoft Fabric.
  • Experience supporting analytics, reporting, or AI/ML workloads is highly desirable.
  • Databricks, Azure, or cloud certifications are preferred.
  • Strong problem-solving, communication, and technical leadership skills.


Technical Proficiency in:

  • Databricks, Apache Spark, PySpark, Delta Lake
  • Python, SQL, Scala (preferred)
  • Cloud platforms: Azure (preferred), AWS, or GCP
  • Azure Data Factory, Kafka, and modern data integration tools
  • Data warehousing: Databricks, Snowflake, or Microsoft Fabric
  • DevOps tools: Git, Azure DevOps, CI/CD pipelines
  • Data architecture, ETL/ELT design, and performance optimization


What You’re Looking For:

Join a fast-growing organization that thrives on innovation and collaboration. You’ll work alongside talented, motivated colleagues in a global environment, helping clients solve their most critical business challenges. At OZ, your contributions matter – you’ll have the chance to be a key player in our growth and success. If you’re driven, bold, and eager to push boundaries, we invite you to join a company where you can truly make a difference.


About Us:

OZ is a 28-year-old global technology consulting, services, and solutions leader specializing in creating business-focused solutions for our clients by leveraging disruptive digital technologies and innovation.


OZ is committed to creating a continuum between work and life by allowing people to work remotely. We offer competitive compensation and a comprehensive benefits package. You’ll enjoy our work style within an incredible culture. We’ll give you the tools you need to succeed so you can grow and develop with us and become part of a team that lives by its core values.

Warehouse Associate (Data Center)
✦ New
Salary not disclosed
Fairfax County, VA 1 day ago

This is a great opportunity for anyone with construction, fabrication, or trade experience (or just a strong work ethic and willingness to learn) to launch a stable career with growth potential.


What You’ll Do as a Field Technician – Entry-Level (Construction / Data Centers)

As a Field Technician, you’ll:

  • Install, assemble, and modify containment systems that improve cooling efficiency in data centers
  • Perform specialized cleaning and decontamination of equipment and areas to keep facilities running at peak performance
  • Assist with deliveries, organize materials, and maintain tools and equipment
  • Follow direction from supervisors to complete tasks safely, accurately, and on time
  • Identify and report potential risks, always prioritizing safety
  • Represent the company professionally with clients and team members


What We’re Looking For in a Field Technician – Entry-Level (Construction / Data Centers)

  • 0–2 years of construction, technician, or trade experience (data center experience is a plus)
  • U.S. citizen (including naturalized citizens), 18+ years old
  • Reliable transportation to job sites
  • Able to pass a background check and drug screen
  • Comfortable working at heights, around noise, and in temperatures from 0–100°F+
  • Physically able to lift 50 lbs and stay on your feet most of the day
  • Positive attitude, strong work ethic, and good communication skills


Schedule & Pay for Field Technician – Entry-Level (Construction / Data Centers)

  • Monday–Friday, 6:00 AM to 3:00 PM (overtime available)
  • Full-time, on-site role
  • Competitive hourly pay with overtime opportunities
  • Full training, safety gear (PPE), and on-the-job mentorship provided


Why Join Us?

  • Be part of the growing data center industry
  • Gain hands-on technical skills with full training
  • Work with a supportive team in a professional environment
  • Build a career with opportunities for advancement


Apply today and start your career in data center construction with a growing technology company!

Exceptions Specialist - Data Entry
✦ New
Salary not disclosed

Exceptions Specialist - Data Entry (Political Campaign Support)

Position Type : Contract

Payrate : $22/hr

Work Location

100% on-site in Westlake Village, CA.


Shifts

  1. PM: Mon–Fri, 4:00 pm – 11:45 pm PST
  2. AM: Mon–Fri, 8:00 am – 5:00 pm PST
  3. Weekends: 8:00 am – 5:00 pm PST

Position Summary

In this Exceptions role with Campaign Offices, you are the final line of defense for data accuracy. You will review voter signature packets and data entry to spot issues, connect the dots, and decide the correct next step in processing. This is a fast-paced, detail-driven role for someone who enjoys solving puzzles, thinking critically, and using deductive reasoning to get to the right answer.

If you like investigating discrepancies, catching what others miss, and making clear decisions based on criteria, you will do well here.


What You Will Do

  • Review and compare entered data against original source materials to identify errors, mismatches, and exceptions
  • Apply deductive reasoning to determine what happened, why it happened, and what action should be taken next
  • Use proprietary software and established criteria to resolve exceptions and keep work moving efficiently
  • Investigate patterns and root causes behind recurring issues and document your findings
  • Communicate clear feedback and insights that help prevent future exceptions
  • Manage physical and digital materials with accuracy, organization, and urgency
  • Work independently while collaborating with the team to hit daily goals and maintain quality standards
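The compare-and-flag review described above can be sketched in plain Python (the record fields and the exception format are hypothetical, purely for illustration):

```python
# Toy sketch of exception review: compare entered records against the
# original source and flag mismatches for follow-up. Field names and the
# exception format are hypothetical.

def find_exceptions(source, entered):
    """Return a list of (record_id, field, source_value, entered_value)
    for every field that disagrees, plus records missing entirely."""
    exceptions = []
    for rec_id, src in source.items():
        ent = entered.get(rec_id)
        if ent is None:
            exceptions.append((rec_id, "<missing>", src, None))
            continue
        for field, src_val in src.items():
            if ent.get(field) != src_val:
                exceptions.append((rec_id, field, src_val, ent.get(field)))
    return exceptions

source = {
    "pkt-001": {"name": "J. Smith", "county": "Ventura"},
    "pkt-002": {"name": "A. Lopez", "county": "Kern"},
}
entered = {
    "pkt-001": {"name": "J. Smith", "county": "Ventura"},
    "pkt-002": {"name": "A. Lopes", "county": "Kern"},  # typo to catch
}
print(find_exceptions(source, entered))
# [('pkt-002', 'name', 'A. Lopez', 'A. Lopes')]
```

The real work uses proprietary software, but the decision logic is the same: every disagreement becomes an exception record that a specialist resolves against established criteria.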


What We Are Looking For

  • Strong deductive reasoning skills and the ability to make accurate decisions using logic and evidence
  • High attention to detail and commitment to getting it right the first time
  • Ability to stay focused in a fast-paced environment with repetitive work that requires consistency
  • Strong organizational skills and material management
  • Clear communication skills and a team-first attitude
  • Comfort using software tools for data entry, review, and analysis
  • Experience in data entry, quality assurance, compliance, or audit-type work is a plus


Contract Details

  • 5-month contract assignment paid on an hourly basis
  • Full-time schedule with consistent hours (AM or PM Shifts available)
  • Work that directly supports a high-impact, time-sensitive campaign operation


What you get:

• Full-time 40 hours per week

• Health benefits with low premiums

• A chance to support meaningful work with a team that moves fast and values accuracy

Datacenter Technician
✦ New
Salary not disclosed
Reno, NV 4 hours ago

Main Duties / Required:

  • Knowledge and use of basic telecom hand tools.
  • Must understand customer service.
  • Clear understanding of job safety requirements.
  • Be able to read and understand floor plans.
  • Should be familiar with wiring schemes and wiring testing.
  • Should be able to pull all types of low voltage cable.
  • Should understand and be capable of performing field terminations and labeling.
  • Reports to the Operations Manager and takes daily direction from Technicians, Technician IIs, Lead Technicians, Senior Technicians, and Advanced Senior Technicians.
  • Capable of maintaining orderly paperwork; capable of running service jobs.
  • Possess the skill to lay out MDF and IDF closets, dress all types of cable, and perform all types of terminations.
  • Capable of working in Data Centers
  • Shall be able to install ladder racking and seismic bracing both above and under raised floor.
  • Basic understanding of both copper and fiber cable testing and troubleshooting.
  • Read and understand blueprints and design documents
  • Dress and furcate fiber trunks for splicing
  • Maintaining orderly paperwork and running service
  • Fusion Splice including Ribbon/Single OSP/ISP
  • Install, connect, and decom network equipment
  • Operate DSX 5000 tester/OTDR Tester
  • Program testers
  • Download test results to Linkware/Linkware Live
  • Save test results, verify, and submit to customer
  • Create mass labels and apply per Portmap
  • Differentiate live cables from decom cable
  • Copper testing and troubleshooting
  • Conduct Service Swaps of live networking devices
  • Understand "FIM" database and operate scanners


PHYSICAL REQUIREMENTS

  • Primarily walking, standing, and bending for extended periods with some sitting.
  • Ability to communicate effectively with verbal, written, visual and listening skills.
  • Dexterity of hands and fingers to operate any required equipment as well as to operate a computer keyboard, mouse, and other technical instruments.
  • Able to lift and carry heavy equipment, up to 50 pounds.
  • Ability to pull cables.
  • Ability to climb ladders.


Key Skills / Words:

Data Center

Technician

Decommission

Splicing

Low Voltage

SAP Data Engineer
✦ New
Salary not disclosed
Omaha, NE 4 hours ago

Duration: 12 Months

Job Description:

Experience Level: Senior, 10-15 Years of Experience in SAP Reporting Platforms

Must have: BW, BI, Datasphere and SAC

Good To Have: Experience converting from On-Prem to Cloud Data Products, SAC Planning, BDC

Desired Profile: Very Strong in Core SAP Data Products - BW, BI, Datasphere, SAC

  • Experience in data modeling and creating data products
  • Good to have: experience with SAC Planning and SAP Business Data Cloud
  • Very good communication skills
  • Very good analytical skills and problem-solving ability are a must
  • Knowledge of Agile development methodology is a plus
  • Must be capable of doing code reviews and mentoring junior developers to drive toward high-quality deliverables
  • Strong background in a culture of delivering projects first-time-right, with zero defects in production

About US Tech Solutions:

"US Tech Solutions is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran."


Recruiter Details:

Name: Rajeev

Email:

Internal Job ID: 26-06170

jobs by JobLookup