Cogent Data Solutions LLC Jobs in USA
Location: Newark, NJ (Hybrid)
Duration: 6 months
Job Description
- Build and maintain data pipelines that collect, store, and transform data to support analytics use cases and business outcomes.
- Implement data ingestion and transformation workflows in Microsoft Fabric, using Fabric-native capabilities such as notebooks, pipelines, and lakehouse patterns.
- Develop and operationalize data solutions across lakehouse layers (e.g., landing and standardized "Bronze" data through curated "Silver/Gold" outputs) aligned to the platform's workspace architecture and OneLake design.
- Ensure data solutions are reliable and supportable by incorporating monitoring, issue resolution, and ongoing enhancements to pipelines and datasets.
- Collaborate across teams (engineering, analytics, product, and stakeholders) to translate data needs into scalable, reusable solutions and improved workflow efficiency.
- Support secure and appropriate use of Fabric assets by following established access and workspace practices.
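The Bronze-to-Silver promotion pattern described above can be sketched in plain Python. This is a minimal illustration with hypothetical field names, not Fabric-specific code; a real implementation would run as a Spark notebook or pipeline against lakehouse tables rather than in-memory dicts:

```python
# Sketch of a Bronze -> Silver promotion step. All record fields
# (customer_id, email, country) are hypothetical examples.

def promote_to_silver(bronze_rows):
    """Standardize raw 'Bronze' rows into curated 'Silver' records:
    trim/normalize values, drop rows missing the primary key, deduplicate."""
    seen = set()
    silver = []
    for row in bronze_rows:
        customer_id = (row.get("customer_id") or "").strip()
        if not customer_id or customer_id in seen:
            continue  # enforce a required key and one row per customer
        seen.add(customer_id)
        silver.append({
            "customer_id": customer_id,
            "email": (row.get("email") or "").strip().lower(),
            "country": (row.get("country") or "unknown").upper(),
        })
    return silver

bronze = [
    {"customer_id": " c1 ", "email": "A@Example.com", "country": "us"},
    {"customer_id": "c1", "email": "a@example.com", "country": "us"},  # duplicate key
    {"customer_id": "", "email": "orphan@example.com"},                # missing key
]
print(promote_to_silver(bronze))
```

The same shape of logic (filter on required keys, normalize, deduplicate) is what typically separates the raw Bronze layer from the curated Silver layer in a medallion design.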
About Pinterest:
Millions of people around the world come to our platform to find creative ideas, dream about new possibilities and plan for memories that will last a lifetime. At Pinterest, we're on a mission to bring everyone the inspiration to create a life they love, and that starts with the people behind the product.
Discover a career where you ignite innovation for millions, transform passion into growth opportunities, celebrate each other's unique experiences and embrace the flexibility to do your best work. Creating a career you love? It's Possible.
At Pinterest, AI isn't just a feature; it's a powerful partner that augments our creativity and amplifies our impact, and we're looking for candidates who are excited to be a part of that. To get a complete picture of your experience and abilities, we'll explore your foundational skills and how you collaborate with AI.
Through our interview process, what matters most is that you can always explain your approach, showing us not just what you know, but how you think. You can read more about our AI interview philosophy and how we use AI in our recruiting process here.
About tvScientific
tvScientific is the first and only CTV advertising platform purpose-built for performance marketers. We leverage massive data and cutting-edge science to automate and optimize TV advertising to drive business outcomes. Our solution combines media buying, optimization, measurement, and attribution in one, efficient platform. Our platform is built by industry leaders with a long history in programmatic advertising, digital media, and ad verification who have now purpose-built a CTV performance platform advertisers can trust to grow their business.
As a Staff Data Engineer at tvScientific, you will be a key player in implementing the robust data infrastructure that powers our data-heavy company. You will collaborate with our cross-functional teams to evolve our core data pipelines, design for efficiency as we scale, and store data in optimal engines and formats. This is an individual contributor role, where you will work to define and implement a strategic vision for data engineering within the organization.
What you'll do:
- Design and implement robust data infrastructure in AWS, using Spark with Scala
- Evolve our core data pipelines to efficiently scale for our massive growth
- Store data in optimal engines and formats, matching your designs to our performance needs and cost factors
- Collaborate with our cross-functional teams to design data solutions that meet business needs
- Design and implement knowledge graphs, exposing their functionality both via Batch Processing and APIs
- Leverage and optimize AWS resources while designing for scale
- Collaborate closely with our Data Science and Product teams
How we'll define success:
- Successful design and implementation of scalable and efficient data infrastructure
- Timely delivery and optimization of data assets and APIs
- High attention to detail in implementation of automated data quality checks
- Effective collaboration with cross-functional teams
What we're looking for:
- Production data engineering experience
- Proficiency in Spark and Scala, with proven experience building data infrastructure in Spark using Scala
- Experience in delivering significant technical initiatives and building reliable, large scale services
- Experience in delivering APIs backed by relationship-heavy datasets
- Familiarity with data lakes, cloud warehouses, and storage formats
- Strong proficiency in AWS services
- Expertise in SQL for data manipulation and extraction
- Excellent written and verbal communication skills
- Bachelor's degree in Computer Science or a related field
Nice-to-haves:
- Experience in adtech
- Experience implementing data governance practices, including data quality, metadata management, and access controls
- Strong understanding of privacy-by-design principles and handling of sensitive or regulated data
- Familiarity with open table formats like Apache Iceberg and Delta Lake
- Previous experience building out a Data Engineering function
- Proven experience working closely with Data Science teams on machine learning pipelines
In-Office Requirement Statement:
- We recognize that the ideal environment for work is situational and may differ across departments. What this looks like day-to-day can vary based on the needs of each organization or role.
Relocation Statement:
- This position is not eligible for relocation assistance. Visit our PinFlex page to learn more about our working model.
At Pinterest we believe the workplace should be equitable, inclusive, and inspiring for every employee. In an effort to provide greater transparency, we are sharing the base salary range for this position. The position is also eligible for equity. Final salary is based on a number of factors including location, travel, relevant prior experience, or particular skills and expertise.
Information regarding the culture at Pinterest and benefits available for this position can be found here.
US-based applicants only: $155,584—$320,320 USD
Our Commitment to Inclusion:
Pinterest is an equal opportunity employer and makes employment decisions on the basis of merit. We want to have the best qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, color, ancestry, national origin, religion or religious creed, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, age, marital status, status as a protected veteran, physical or mental disability, medical condition, genetic information or characteristics (or those of a family member) or any other consideration made unlawful by applicable federal, state or local laws. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you require a medical or religious accommodation during the job application process, please complete this form for support.
The pay range for this role is $150,000 - $200,000/yr USD.
WHO WE ARE:
Headquartered in Southern California, Skechers—the Comfort Technology Company®—has spent over 30 years helping men, women, and kids everywhere look and feel good. Comfort innovation is at the core of everything we do, driving the development of stylish, high-quality products at a great value. From our diverse footwear collections to our expanding range of apparel and accessories, Skechers is a complete lifestyle brand.
ABOUT THE ROLE:
Skechers Digital Team is seeking a Digital Data Architect reporting to the Director, Digital Architecture, Consumer Domain. This role is responsible for designing and governing Skechers’ Consumer Data 360 ecosystem, enabling identity resolution, high-quality data foundations, personalization, loyalty intelligence, and machine learning capabilities across digital and retail channels.
The ideal candidate will be a strong technical leader, have hands-on full-stack technical knowledge in enterprise technologies related to Skechers' consumer domain, and have the ability to work in a fast-paced agile environment. You should have knowledge of consumer programs from an architecture/industry perspective, and you should have strong hands-on experience designing solutions on the Salesforce Core Platform (including configuration, integration, and data model best practices).
You will work cross-functionally with Digital Engineering, Data Engineering, Data Science, Loyalty, and Marketing teams to architect scalable, secure, and high-performance data platforms that support advanced personalization and recommender systems.
WHAT YOU’LL DO:
- Responsible for the full technical life cycle of consumer platform capabilities which includes:
- Capability roadmap and technical architecture in alignment to consumer experience
- Technical planning, design, and execution
- Operations, analytics/reporting, and adoption
- Define and evolve Skechers’ Consumer Data 360 architecture, including identity resolution (deterministic and probabilistic matching) and unified customer profiles.
- Architect scalable data models and pipelines across CDP, CRM, e-commerce, marketing automation, data lake, and warehouse platforms.
- Establish enterprise data quality frameworks including validation, deduplication, anomaly detection, and observability.
- Optimize SQL workloads and large-scale distributed queries through performance tuning, partitioning, indexing, and workload management strategies.
- Design and oversee ML pipelines supporting personalization, churn modeling, and recommender systems.
- Partner with Data Science teams to productionize models using distributed platforms such as Databricks (Spark, Delta Lake, MLflow preferred).
- Ensure secure data governance, access control (RBAC/ABAC), and compliance with GDPR, CCPA, and related privacy regulations.
- Provide architectural oversight ensuring performance, scalability, resilience, and maintainability.
- Collaborate with stakeholders to translate business objectives (LTV growth, personalization lift, engagement) into scalable data solutions.
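As a rough illustration of the deterministic identity resolution mentioned above, the sketch below merges records that share an exact normalized email or phone number. All field names are hypothetical, and a real Consumer Data 360 build would layer probabilistic matching and a CDP on top of this kind of exact-match union:

```python
# Deterministic identity resolution sketch: union-find over records
# linked by any shared exact-match key. Field names are hypothetical.
from collections import defaultdict

def normalize_email(email):
    return email.strip().lower() if email else None

def normalize_phone(phone):
    digits = "".join(ch for ch in (phone or "") if ch.isdigit())
    return digits[-10:] if len(digits) >= 10 else None

def resolve_identities(records):
    """Return clusters of record indices that belong to one profile."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    key_to_first = {}
    for i, rec in enumerate(records):
        for key in (normalize_email(rec.get("email")),
                    normalize_phone(rec.get("phone"))):
            if key is None:
                continue
            if key in key_to_first:
                union(i, key_to_first[key])  # shared key -> same profile
            else:
                key_to_first[key] = i

    clusters = defaultdict(list)
    for i in range(len(records)):
        clusters[find(i)].append(i)
    return list(clusters.values())

records = [
    {"email": "Jane@Shop.com", "phone": None},            # 0
    {"email": "jane@shop.com", "phone": "555-010-1234"},  # 1: joins 0 via email
    {"email": None, "phone": "(555) 010-1234"},           # 2: joins 1 via phone
    {"email": "someone.else@shop.com", "phone": None},    # 3: separate profile
]
print(resolve_identities(records))
```

Transitive linking (record 0 joins record 2 only through record 1) is the reason a union-find structure, rather than simple grouping, is the usual choice here.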
REQUIREMENTS:
- Computer Science, Data Engineering, or related degree or equivalent experience.
- 12+ years of experience architecting enterprise data platforms in cloud environments.
- 9+ years of data engineering experience with a focus on consumer data.
- 6+ years of experience working with Salesforce platforms, including data models and enterprise integrations.
- Strong experience with Data 360 and identity resolution architectures.
- Proven expertise in SQL performance tuning and large-scale data modeling.
- Hands-on experience implementing ML pipelines and recommender systems in production environments.
- Experience with cloud technologies (AWS, GCP, or Azure).
- Experience with integration patterns (API, ETL, event streaming).
- Experience providing technical leadership and guidance across multiple projects and development teams.
- Experience translating business requirements into detailed technical specifications and working with development teams through implementation, including issue resolution and stakeholder communication.
- Strong project management skills including scope assessment, estimation, and clear technical communication with both business users and technical teams.
- Must hold at least one of the following Salesforce Certifications (Platform App Builder, Platform Developer 1, JavaScript Developer 1).
- Experience with Databricks or similar distributed data/ML platforms preferred.
Hi,
This is Vamshi from Software Technology. We have a job opening with our client for the position of DBA/Data Architect. If you are available and looking for new opportunities, please send me your updated resume for the position below ASAP.
Job Title: DBA/DATA Architect
Location: Denver, CO (Hybrid, 3 days work from office)
Duration: Full Time / Long-Term Contract
Must-have skills: Architect-level expertise preferred, but must be hands-on with a proactive, driver mindset (individual contributor; no team management responsibility)
Skills To Be Evaluated On
Azure Data Factory, MS SQL Server DBA, Performance Tuning, Data Pipeline, Data Management
Technical Skills
- Azure Data Factory, Azure Data Lake Storage (ADLS), Azure Databricks, Azure SQL Database.
- Strong SQL, Python, Scala, and PySpark.
- Experience in building data models and schema design.
- Experience with SSIS
- Oracle DB experience would be a Plus
Roles & Responsibilities
- Design, build, and optimize data pipelines and ETL processes using Azure Data Factory and Databricks.
- Manage Azure SQL Database, including performance tuning, indexing, and query optimization.
- Design scalable data solutions, data lakes (ADLS), and data warehousing solutions.
- Ensure data security, quality, and integrity throughout the lifecycle.
- Monitor data pipelines and resolve performance issues or failures.
- Work with data scientists and analysts to support data-driven decision-making.
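The indexing and query-tuning work described above can be illustrated with a tiny SQLite example (table and index names are hypothetical; on Azure SQL you would inspect execution plans in SSMS rather than `EXPLAIN QUERY PLAN`, but the scan-vs-seek principle is the same):

```python
# Sketch of index-driven query tuning using SQLite's query planner.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"

# Before indexing: the planner must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# A covering index on the filter column turns the scan into a seek
# and lets the sum be computed from the index alone.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[-1][-1])  # plan detail before the index (a table scan)
print(after[-1][-1])   # plan detail after the index (an index search)
```

Verifying the plan change, not just the timing, is what distinguishes deliberate tuning from guesswork.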
Thanks,
Vamshi Thangadpalli
Technical Recruiter
Email: | Web: | Overlook Center, Suite 200
Princeton, NJ 08540.
Overall Responsibility:
This role supports the design, development, and optimization of Arora’s enterprise data and ERP systems. This role reports directly under the Data Analytics Manager to improve financial reporting, support platform integrations, and build scalable data architecture that enables informed decision-making across the organization.
The position combines technical execution (SQL, automation, system configuration) with financial reporting support and cross-platform integration work to ensure accuracy, efficiency, and long-term system sustainability.
Essential Functions:
- Execute reporting and system requests in alignment with established data governance standards and reporting frameworks under the direction of the Data Analytics Manager.
- Contribute to the design of data models and system workflows that reduce manual processes and improve cross-functional data visibility.
- Support internal dashboards by creating backend data solutions and integrating with Vision.
- Provide system-level troubleshooting and ensure data consistency and reliability across platforms.
- Collaborate with teams to streamline processes through automation and data tools.
- Maintain documentation of data procedures, workflows, and system modifications.
- Support financial reporting and analysis by developing standardized, scalable reporting solutions aligned with company-wide data architecture.
- Assist in translating financial and operational requirements into structured reporting outputs and automation workflows.
- Assist in platform integrations (ERP, CRM, BI tools, and other enterprise systems) to support long-term architectural alignment and scalability.
Needed Skills:
- Ability to program in SQL at an expert level to support data processes. Potential need for other programming language knowledge (Java, Python, etc.).
- Ability to create and maintain productive relationships with employees, clients, and vendors.
Education/Experience Minimum:
- 3-5 years of experience
- Strong programming skills having the ability to write complex queries.
- Preferred familiarity with all Microsoft platforms, including but not limited to Excel, Power BI, SharePoint, and SQL Server.
- Preferred experience with Deltek Vision v7.6 and VantagePoint
- Experience in building automated processes and data workflows.
- Strong problem-solving and attention to detail.
Our client, a leading organization in the real estate investment and property management sector, is seeking a Data Analyst to join their IT team. This role will focus on building and maintaining data solutions, developing dashboards, and delivering actionable insights to support business decision-making across the organization.
Responsibilities
- Build and maintain data warehouses and reporting solutions
- Develop dashboards and ad hoc reports using Power BI
- Write and optimize SQL queries for data extraction and analysis
- Partner with stakeholders to gather and translate reporting requirements
- Troubleshoot and resolve data and reporting issues
- Support internal users with reporting tools and best practices
- Contribute to data governance and process improvements
- Stay current with Microsoft Fabric, Azure, and related technologies
Qualifications
- Bachelor’s degree in Computer Science, Statistics, Mathematics, or related field
- 3+ years of experience in data analysis and visualization
- Strong experience with Power BI
- Advanced SQL skills
- Experience with SQL Server, SSRS, SSIS, Excel, Power Query, and Power Pivot
- Experience with Microsoft Fabric preferred
- Familiarity with Azure environments preferred
- Experience with Azure DevOps Git is a plus
- Strong analytical, problem-solving, and communication skills
Sr. Data Engineer (Hybrid)
Chicago, IL
The American Medical Association (AMA) is the nation's largest professional Association of physicians and a non-profit organization. We are a unifying voice and powerful ally for America's physicians, the patients they care for, and the promise of a healthier nation. To be part of the AMA is to be part of our Mission to promote the art and science of medicine and the betterment of public health.
At AMA, our mission to improve the health of the nation starts with our people. We foster an inclusive, people-first culture where every employee is empowered to perform at their best. Together, we advance meaningful change in health care and the communities we serve.
We encourage and support professional development for our employees, and we are dedicated to social responsibility. We invite you to learn more about us and we look forward to getting to know you.
We have an opportunity at our corporate offices in Chicago for a Sr. Data Engineer (Hybrid) on our Information Technology team. This is a hybrid position reporting into our Chicago, IL office, requiring 3 days a week in the office.
As a Sr. Data Engineer, you will play a key role in implementing and maintaining AMA's enterprise data platform to support analytics, interoperability, and responsible AI adoption. This role partners closely with platform engineering, data governance, data science, IT security, and business stakeholders to deliver high-quality, reliable, and secure data products. This role contributes to AMA's modern lakehouse architecture, optimizing data operations, and embedding governance and quality standards into engineering workflows. This role serves as a senior technical contributor within the team, providing mentorship to junior engineers and implementing engineering best practices within the data platform function, in alignment with architectural direction set by leadership.
RESPONSIBILITIES:
Data Engineering & AI Enablement
- Build and maintain scalable data pipelines and ETL/ELT workflows supporting analytics, operational reporting, and AI/ML use cases.
- Implement best-practice patterns for ingestion, transformation, modeling, and orchestration within a modern lakehouse environment (e.g., Databricks, Delta Lake, Azure Data Lake).
- Develop high-performance data models and curated datasets with strong attention to quality, usability, and interoperability; create reusable engineering components and automation.
- Collaborate with the Architecture Team, the Data Platform Lead, and federated IT teams to optimize storage, compute, and architectural patterns for performance and cost-efficiency.
- Build model-ready data sets and feature pipelines to support AI/ML use cases; serve as a technical coordination point supporting business units' AI-related infrastructure needs.
- Collaborate with data scientists and the AI Working Group to operationalize models responsibly and maintain ongoing monitoring signals.
Governance, Quality & Compliance
- Embed data governance, metadata standards, lineage tracking, and quality controls directly into engineering workflows; ensure technical implementation and alignment within engineering workflows.
- Work with the Data Governance Lead and business stakeholders to operationalize stewardship, classification, validation, retention, and access standards.
- Implement privacy-by-design and security-by-design principles, ensuring compliance with internal policies and regulatory obligations.
- Maintain documentation for pipelines, datasets, and transformations to support transparency and audit requirements.
Platform Reliability, Observability & Optimization
- Monitor and troubleshoot pipeline failures, performance bottlenecks, data anomalies, and platform-level issues.
- Implement observability tooling, alerts, logging, and dashboards to ensure end-to-end reliability.
- Support cost governance by optimizing compute resources, refining job schedules, and advising on efficient architecture.
- Collaborate with the Data Platform Lead on scaling, configuration management, CI/CD pipelines, and environment management.
- Collaborate with business units to understand data needs, translate them into engineering requirements, and deliver fit-for-purpose data solutions; share and apply best practices and emerging technologies within assigned initiatives.
- Work with IT Security and Legal/Compliance to ensure platform and datasets meet risk and regulatory standards.
Staff Management
- Lead, mentor, and provide management oversight for staff.
- Responsible for setting objectives, evaluating employee performance, and fostering a collaborative team environment.
- Responsible for developing staff knowledge and skills to support career development.
May include other responsibilities as assigned
REQUIREMENTS:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field preferred; equivalent work experience with an HS diploma/equivalent education required.
- 5+ years of experience in data engineering within cloud environments
- Experience in people management preferred.
- Demonstrated hands-on experience with modern data platforms (Databricks preferred).
- Proficiency in Python, SQL, and data transformation frameworks.
- Experience designing and operationalizing ETL/ELT pipelines, orchestration workflows (Airflow, Databricks Workflows), and CI/CD processes.
- Solid understanding of data modeling, structured/unstructured data patterns, and schema design.
- Experience implementing governance and quality controls: metadata, lineage, validation, stewardship workflows.
- Working knowledge of cloud architecture, IAM, networking, and security best practices.
- Demonstrated ability to collaborate across technical and business teams.
- Exposure to AI/ML engineering concepts, feature stores, model monitoring, or MLOps patterns.
- Experience with infrastructure-as-code (Terraform, CloudFormation) or DevOps tooling.
The American Medical Association is located at 330 N. Wabash Avenue, Chicago, IL 60611 and is convenient to all public transportation in Chicago.
This role is an exempt position, and the salary range for this position is $115,523.42-$150,972.44. This is the lowest to highest salary we believe we would pay for this role at the time of this posting. An employee's pay within the salary range will be determined by a variety of factors including but not limited to business consideration and geographical location, as well as candidate qualifications, such as skills, education, and experience. Employees are also eligible to participate in an incentive plan. To learn more about the American Medical Association's benefits offerings, please click here.
We are an equal opportunity employer, committed to diversity in our workforce. All qualified applicants will receive consideration for employment. As an EOE/AA employer, the American Medical Association will not discriminate in its employment practices due to an applicant's race, color, religion, sex, age, national origin, sexual orientation, gender identity and veteran or disability status.
THE AMA IS COMMITTED TO IMPROVING THE HEALTH OF THE NATION
Remote working/work at home options are available for this role.
At MVP Health Care, we're on a mission to create a healthier future for everyone. That means embracing innovation, championing equity, and continuously improving how we serve our communities. Our team is powered by people who are curious, humble, and committed to making a difference-every interaction, every day. We've been putting people first for over 40 years, offering high-quality health plans across New York and Vermont and partnering with forward-thinking organizations to deliver more personalized, equitable, and accessible care. As a not-for-profit, we invest in what matters most: our customers, our communities, and our team.
What's in it for you:
- Growth opportunities to uplevel your career
- A people-centric culture embracing and celebrating diverse perspectives, backgrounds, and experiences within our team
- Competitive compensation and comprehensive benefits focused on well-being
- An opportunity to shape the future of health care by joining a team recognized as a Best Place to Work For in the NY Capital District, one of the Best Companies to Work For in New York, and an Inclusive Workplace.
You'll contribute to our humble pursuit of excellence by bringing curiosity to spark innovation, humility to collaborate as a team, and a deep commitment to being the difference for our customers. Your role will reflect our shared goal of enhancing health care delivery and building healthier, more vibrant communities.
The Sr. Quality Data Analyst will be responsible for leading and overseeing operational workflows within the Health Care Quality Analytics team. The ideal candidate will be accountable for ensuring the team delivers routine and ad hoc analyses and data visualizations to support MVP's health care quality functional area. The ideal candidate will have experience working with NCQA and CMS quality measures and HEDIS data to support improved health care outcomes and member satisfaction. They will also participate in automation efforts that create efficiencies and help to create a data-driven organization. The Sr. Quality Data Analyst will work with cross-functional teams, including business, technical, and Data Governance teams, to ensure the availability, accuracy, and reliability of data.
In alignment with MVP's core values, the Sr. Quality Data Analyst will be expected to demonstrate strong interpersonal and communication skills, promoting cooperation across organizational boundaries and encouraging groups to work together cooperatively. They will have strong analytical thinking skills, and a focus on continuously improving processes and reducing technical debt. Additionally, they will be self-motivated, with a sense of accountability and urgency in completing assignments.
Key Responsibilities:
- Lead and oversee the successful execution of operational workflows and health care quality data deliverables.
- Have experience working with HEDIS, Medicare Stars, and NYSDOH QARR measures data and a good understanding of health care quality measurement.
- Conduct analysis of large data sets to support health care quality improvement initiatives, including gap analysis, process optimization, and patient engagement.
- Collaborate with cross-functional teams to design, implement, and maintain data solutions that meet the needs of stakeholders and business partners.
- Ensure the accuracy and integrity of data through the development and implementation of data quality control processes and procedures.
- Provide training and mentorship to team members to promote growth and development.
- Participate in the development of data governance policies, standards, and procedures, and ensure compliance with regulatory requirements and industry best practices.
- Present data insights and recommendations to leadership, effectively communicating complex technical information to non-technical stakeholders.
- Continuously monitor and evaluate the effectiveness of operational workflows, making recommendations for improvements and leading implementation efforts as necessary.
Position Qualifications
Minimum Education
Bachelor's degree in a related field (e.g. Mathematics, Statistics, Computer Science, Epidemiology, or Healthcare) required; Master's degree preferred.
Minimum Experience
5+ years of experience in healthcare data analysis, with a strong focus on health care quality analytics and operations.
Experience leading teams and executing on operational workflows.
Required Skills
- Strong analytical skills, with the ability to turn data into actionable insights.
- Proficiency in SQL, Azure Databricks, data visualization tools (e.g. Tableau, PowerBI), and data manipulation tools (e.g. Alteryx, R, Python).
- Excellent verbal and written communication skills, with the ability to effectively communicate technical information to both technical and non-technical stakeholders.
- Ability to work independently and as part of a team, with strong project management skills and the ability to prioritize tasks effectively.
- Keen attention to detail.
- Subject matter expertise of healthcare industry quality metrics, Medicare Stars and HEDIS standards.
Pay Transparency
MVP Health Care is committed to providing competitive employee compensation and benefits packages. The base pay range provided for this role reflects our good faith compensation estimate at the time of posting. MVP adheres to pay transparency nondiscrimination principles. Specific employment offers and associated compensation will be extended individually based on several factors, including but not limited to geographic location; relevant experience, education, and training; and the nature of and demand for the role.
We do not request current or historical salary information from candidates.
$93,667.00-$124,576.75
MVP's Inclusion Statement
At MVP Health Care, we believe creating healthier communities begins with nurturing a healthy workplace. As an organization, we strive to create space for individuals from diverse backgrounds and all walks of life to have a voice and thrive. Our shared curiosity and connectedness make us stronger, and our unique perspectives are catalysts for creativity and collaboration.
MVP is an equal opportunity employer and recruits, employs, trains, compensates, and promotes without discrimination based on race, color, creed, national origin, citizenship, ethnicity, ancestry, sex, gender identity, gender expression, religion, age, marital status, personal appearance, sexual orientation, family responsibilities, familial status, physical or mental disability, handicapping condition, medical condition, pregnancy status, predisposing genetic characteristics or information, domestic violence victim status, political affiliation, military or veteran status, Vietnam-era or special disabled Veteran or other legally protected classifications.
To support a safe, drug-free workplace, pre-employment criminal background checks and drug testing are part of our hiring process. If you require accommodations during the application process due to a disability, please contact our Talent team at .
This role involves leading complex technology projects, impacting business outcomes through innovative data solutions.
Candidates should have a strong background in data architecture, cloud technologies, and experience mentoring teams.
The successful applicant will engage with clients, ensuring effective delivery and quality management within a dynamic consulting environment.
JT4 supports the Air Force, Space Force, and Navy under the Joint Range Technical Services Contract, better known as J-Tech II.
JT4 develops and maintains realistic, integrated test and training environments and prepares our nation's war-fighting aircraft, weapons systems, and aircrews for today's missions and tomorrow's global challenges.
JOB SUMMARY
- ESSENTIAL FUNCTIONS/DUTIES A Joint Data Network & Information Technology Field Engineer performs or leads complex field engineering assignments and perform a variety of engineering assignments involving technology applications involved in the installation, operation, testing, and maintenance of complex electronic / mechanical equipment and information technology (IT) systems and infrastructures.
The employee will be responsible for the following functions/duties:
- Apply knowledge of design specifications for more complex systems/projects
- Coordinate and work closely with other engineering, logistics, financial, and program management disciplines to define system specifications and requirements
- Recommend technology refresh upgrades for end-of-life systems as a technical subject matter expert
- Define and write functional and security requirements for service delivery of command, control, computer, communication, and IT systems, which could include virtual environments and associated applications
- Manage, design, and implement command-and-control, debriefing, and other associated information system infrastructure and applications
- Provide operation, troubleshooting, and debugging support for computer operations systems, including system security, access, configuration, backups, and restores
- Assist in incident handling in conjunction with the Facility Security Officer and Information Security Officer / Information Systems Security Manager
- Ensure common engineering principles are consistently applied to internal software, hardware, and multimedia inventories
- Research engineering solutions to address software and hardware requirements in support of system sustainment and new system projects involving computing, networking, video, audio, and data, taking emissions and communications requirements into consideration
- Assist in system administration, troubleshooting, and remediation of operational performance issues in command-and-control, debriefing, and other IT systems
- Verify and comply with engineering documentation standards and test procedures
- Prepare, deliver, and submit technical papers and perform engineering studies
- Support development of technical proposals and provide comments on the technical content and level of effort of the proposed scope of work
- Develop, maintain, and produce technical documentation and system/subsystem specifications
- Interface and liaise directly with customers at all levels, from quotation through final design and test activities, design reviews, and technical working group meetings, to comply with requirements and specifications
- Conduct site visits and experimental investigations; analyze engineering problems, propose solutions and alternatives, and provide recommendations
- Perform other job-related duties as required
DESIRED QUALIFICATIONS
- Ability to work with a group or independently
- High energy, multi-disciplined, and professional
- Experience in a variety of disciplines across electronics, computers, networks, radio frequency, datalink, video, mechanical, and associated technological fields
REQUIREMENTS
EDUCATION, TECHNICAL, AND WORK EXPERIENCE
An associate's degree in engineering or another technical discipline, or formal academic/vocational/military training, and a minimum of 12 years of experience in the specialty field are required for this position.
In addition, a Joint Data & Information Technology Field Engineer must possess the following qualifications:
- Broad knowledge of concepts, principles, and practices of engineering that enables performance as a senior technical contributor on complex projects or programs
- Knowledge and skill sufficient to apply developments in engineering to solve problems in the specialty area
- Working knowledge of computer systems and integrated software application programs
- Ability to investigate, troubleshoot, and design solutions to problems in operational hardware and software
- Excellent communication and analytical skills
- Planning/organizational skills
BENEFITS
- Medical, dental, and vision insurance benefits active on Day 1
- Life insurance
- Health Savings Accounts / FSAs
- Disability insurance
- Paid time off
- 401(k) plan options with employer match: JT4 will match 50%, up to an 8% contribution, with 100% immediate vesting
- Tuition reimbursement
OTHER RESPONSIBILITIES
Each employee must read, understand, and implement the general and specific operational, safety, quality, and environmental requirements of all plans, procedures, and policies pertaining to their job.
WORKING CONDITIONS
Work is typically performed in an office environment with no unusual hazards.
Occasional lifting (up to 750 pounds), constant sitting and use of a computer terminal; constant use of sight abilities while writing, reviewing, and editing documents; constant use of speech/hearing abilities for communication; and constant mental alertness are required.
Travel to remote company work locations will be required.
DISCLAIMER
The above statements are intended to describe the general nature and level of work being performed by personnel assigned to this classification.
They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of persons so classified.
Tasking is in support of a Federal Government Contract that requires U.S. citizenship.
Some jobs may require a candidate to be eligible for a government security clearance, state-issued driver's license, or other licenses/certifications, and the inability to obtain and maintain the required clearance, license, or certification may affect an employee's ability to maintain employment.