We are looking for a talented individual to join our Engineering team to help manage and evolve the critical systems and codebase that power our data & analytics products. This role requires a strong bias for action, comfort producing high-velocity, high-quality output, and mastery of data movement and manipulation across disparate systems. This is a full-time onsite role in our downtown Asheville office and is eligible for our full array of benefits.
Exhibit Python mastery by writing excellent, maintainable, and scalable code
Work with multiple data types and databases to assist in the creation, movement, & maintenance of our data & analytics products
Troubleshoot data pipeline and system issues while addressing technical debt to improve system stability and scaling
Accurately scope and estimate work to assist with overall project and resource planning
Frequently communicate and collaborate with other data engineers and developers
On-call rotation for occasional Linux server management, following runbooks
BS in Computer Science, Software Systems Engineering, Information Technology, or a related field
3+ years of general-purpose Python coding
2+ years of experience in data systems, data integrity, and data quality assurance processes
2+ years of experience working with continuous-integration and development/production environments
Data administration, architecture, and optimizing big data pipelines
Data visualization in Python, Tableau, or other BI tools
Scrum and Agile software development
Networking in AWS and Linux
System security with AWS IAM
DevOps in any mainstream configuration management software
Containerization with tools like Docker
Must Have Skills:
Advanced SQL – relational and columnar
Familiarity with MySQL or PostgreSQL
ETL tools and data processing pipelines
Advanced Python (v2 & v3)
Strong technical documentation skills
Advanced Python data analysis and manipulation libraries (e.g., NumPy, SciPy, pandas/Dask, Matplotlib)
Strong understanding of monitoring and alerting systems
Nice to Have:
Orchestration tools like Apache Airflow
Web Framework (Django, Flask, Tornado)
Business Intelligence (Tableau, Power BI, Looker, etc.)
About BuildFax, a Verisk Business: BuildFax collects and organizes the data that helps companies solve today's critical property-related problems. The BuildFax U.S. Property History database is a proprietary property intelligence engine and data resource that contains building and permitting information from cities and counties throughout the country. Verisk Analytics is an American data analytics and risk assessment firm serving customers worldwide in insurance, natural resources, financial services, government, and risk management.