Terraform GitHub Repository Jobs: Remote Jobs in USA
2 positions found
As a Data Science/Data Engineer Intern, you will work on cutting-edge analytical and data engineering projects that drive measurable business impact across pricing, underwriting, marketing, and claims.
This internship is ideal for a technically curious, motivated problem-solver who wants hands-on data science experience.
RESPONSIBILITIES
- Support the design, construction, and optimization of robust data pipelines to enable machine learning and analytical modeling.
- Contribute to the design and implementation of data and ML workflows using orchestration tools such as Dagster, Airflow, or similar frameworks.
- Help implement data quality checks, validation routines, and monitoring for automated data workflows.
- Assist in organizing and managing internal GitHub repositories to standardize ML project structures and best practices.
- Collaborate with data scientists and engineers to automate the ingestion, transformation, and delivery of data for model development.
- Contribute to initiatives migrating analytical processes into cloud-based data lake architectures and modern platforms such as AWS or Snowflake.
- Develop reusable and well-tested code to support analytical pipelines and internal tools using Python and SQL.
- Conduct data mining, cleansing, and preparation tasks to build high-quality analytical datasets.
- Participate in model development, including data profiling, model training, validation, and interpretation.
- Build and evaluate predictive models that enhance profitability through improved segmentation and estimation of insurance risk.
- Assist in studies evaluating new business models for customer segmentation, retention, and lifetime value.
- Collaborate with business leaders to translate insights into operational improvements and cost efficiencies.
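As a rough illustration of the data-quality and validation work described above, here is a minimal sketch of a pipeline check in Python with pandas. The dataset, column names, and rules (`policy_id`, `premium`, `state`) are hypothetical, not taken from the posting.

```python
# Hypothetical sketch: simple data-quality checks of the kind a pipeline
# step might run before handing a dataset to model training.
import pandas as pd

def validate_policies(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality issues found in a policies dataset."""
    issues = []
    # Required columns must be present.
    for col in ("policy_id", "premium", "state"):
        if col not in df.columns:
            issues.append(f"missing column: {col}")
    # The primary key must be non-null and unique.
    if "policy_id" in df.columns:
        if df["policy_id"].isna().any():
            issues.append("null policy_id values")
        if df["policy_id"].duplicated().any():
            issues.append("duplicate policy_id values")
    # Premiums should be positive.
    if "premium" in df.columns and (df["premium"] <= 0).any():
        issues.append("non-positive premium values")
    return issues

df = pd.DataFrame({
    "policy_id": [1, 2, 2],
    "premium": [1200.0, -50.0, 980.0],
    "state": ["MA", "NJ", "CT"],
})
print(validate_policies(df))  # flags the duplicate id and the negative premium
```

In an orchestrated workflow (Dagster, Airflow, or similar), a function like this would typically run as its own task, failing the run or raising an alert when the returned list is non-empty.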
QUALIFICATIONS
- Currently pursuing or recently completed a Master’s in Data Science, Computer Science, Statistics, Economics, or related field.
- Proficiency in Python (Pandas, NumPy, Scikit-learn, XGBoost, or PyTorch) and SQL.
- Understanding of data engineering concepts, ETL/ELT workflows, and machine learning deployment.
- Exposure to workflow orchestration tools (e.g., Airflow, Dagster, Prefect) and Git/GitHub for collaborative development.
- Familiarity with Docker, CI/CD pipelines, and infrastructure-as-code tools such as Terraform preferred.
- Knowledge of AWS cloud services such as S3, Lambda, EC2, or SageMaker a plus.
- Experience with common modeling techniques (e.g., GLM, tree-based models, Bayesian statistics, NLP, deep learning) through coursework or projects.
- Strong analytical, communication, and problem-solving skills.
- A self-starter mindset, with attention to detail and enthusiasm for learning new technologies.
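To give a concrete sense of the modeling techniques listed above, here is a minimal sketch of a Poisson GLM for claim frequency using scikit-learn. The data is simulated and the rating factors (driver age, vehicle age) are illustrative assumptions, not details from the role.

```python
# Hypothetical sketch: a Poisson GLM for claim frequency, one of the
# modeling techniques the posting lists. All data here is simulated.
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Illustrative rating factors: driver age (18-80) and vehicle age (0-20).
X = np.column_stack([rng.uniform(18, 80, n), rng.uniform(0, 20, n)])
# Simulated claim counts: frequency falls with driver age, rises with vehicle age.
rate = np.exp(0.5 - 0.02 * X[:, 0] + 0.03 * X[:, 1])
y = rng.poisson(rate)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = PoissonRegressor(alpha=1e-4).fit(X_train, y_train)
print("mean predicted claim frequency:", round(model.predict(X_test).mean(), 3))
```

A tree-based model (e.g. XGBoost) could be swapped in for the GLM with the same train/predict interface; the GLM is shown because its coefficients map directly onto rating-factor interpretation.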
SALARY RANGE
The pay for this position is $35 per hour.
ABOUT THE COMPANY
The Plymouth Rock Company and its affiliated group of companies write and manage over $2 billion in personal and commercial auto and homeowner’s insurance throughout the Northeast and mid-Atlantic, where we have built an unparalleled reputation for service. We continuously invest in technology, our employees thrive in our empowering environment, and our customers are among the most loyal in the industry. The Plymouth Rock group of companies employs more than 1,900 people and is headquartered in Boston, Massachusetts. Plymouth Rock Assurance Corporation holds an A.M. Best rating of “A-/Excellent.”
This individual shall apply new and emerging technologies to the software development process.
Required Skills & Qualifications: B.S. in Computer Science, Engineering, Mathematics, or equivalent experience.
Advanced full-stack software development experience, building enterprise web and middle-tier applications using Angular (16+) and core Java with Spring/Spring Boot.
Leadership to guide, encourage, and motivate your fellow engineers.
Experience working in an Agile Scrum development environment.
Experience in REST API development via a gateway.
Experience with Docker, Kubernetes, Terraform, and AWS cloud deployment/application management.
Experience with unit testing and test automation libraries/strategies (Cypress/Karate/Cucumber).
Experience building and deploying applications using continuous integration pipelines and automated deployment tools such as Jenkins.
Experience using source control and pull requests for collaborative development in code repository tools such as GitHub.
Strong communications and problem-solving skills.
Preferred Skills: Experience with PDF generation and an understanding of PDF reporting.
Duties and Responsibilities:
Developing and deploying software in a fast-paced environment.
Collaborating with colleagues on technical implementation and process improvement.
Working closely with technology and business partners to design new features.
Passion for learning the latest technologies and frameworks.
Building positive relationships within and across teams.
Mentoring, and being mentored by, your team members and partners.