Full Stack Engineer working with Data Scientists, Permanent, Morrisville, NC, $60.00-$67.00/hr
Hays Specialist Recruitment is working in partnership with IQVIA to manage the recruitment of this position.
The end client is unable to sponsor or transfer visas for this position; all candidates authorized to work in the US without sponsorship are encouraged to apply.
* Collaborating with data scientists, data engineers and other developers to turn working prototypes into well-abstracted, reusable Python modules for iterative development of data science projects.
* Overseeing technical aspects of greenfield projects from concept to completion.
* Digging into a variety of databases to engineer data pipelines that extract datasets for ML training and prediction.
* Providing technical leadership and assisting the team in executing technical tasks centered on delivering analytics models as containerized applications.
* Identifying opportunities to improve applications, such as reducing service response times and enabling horizontal scaling.
* Comprehensively testing your own code.
* Deploying microservices to production on a Kubernetes cluster through a CI/CD pipeline that you will design and set up.
Skills & Requirements
* Bachelor's or master's degree in a STEM field such as Computer Science, Engineering, Statistics, Mathematics, or Biotechnology.
* Expert familiarity with Python 3, supported by 5+ years of backend programming experience using object-oriented and functional paradigms
* Experience with relational databases and a good understanding of SQL
* Experience with non-relational databases like MongoDB and Redis
* Experience designing and implementing REST APIs with frameworks such as Flask and Falcon
* Proficient understanding of designing microservices-based applications
* Strong unit testing and debugging skills
* Proficient understanding of code versioning tools such as Git
* Linux proficiency and experience with containerization tools such as Docker, Kubernetes
* Experience in following Scrum best practices
* Experience in putting machine learning models into production
* Familiarity with advanced Python data structures such as NumPy arrays and pandas DataFrames
* Familiarity with the process of building Python packages
* Experience with workflow managers such as Airflow, Azkaban or Luigi
* Familiarity with the Hadoop ecosystem (YARN, Hive, Impala, HDFS)
* Experience working with large volumes of complex data, preferably in distributed frameworks such as Spark
* Experience deploying code to production through CI/CD tools such as Jenkins
* Experience programming in Scala
* Experience with the ELK stack or Kafka