About the Customer’s Project:
This is the world’s largest family history and consumer genetics company. With more than 27 billion digitized global historical records, 100 million family trees, and 18 million people in a growing DNA database, our client helps customers discover their family story and gain a new level of understanding about their lives.
 
Responsibilities:
- Writing Python scripts for data file processing (MS Excel, CSV, JSON)
- Working with the Alation REST API: registering and describing database structures, schemas, and tables (a brief, illustrative sketch follows this list)
- Working with the enterprise tools engineer to deploy data pipelines in the Tibco Data Quality or Data Migration tool on the server
- Building CI/CD pipelines and automated jobs in Jenkins
- Implementing and maintaining the Data Quality and Data Migration tools
- Evaluating large datasets, addressing and resolving reconcilable discrepancies through data analysis, and finding opportunities for continuous data improvement
- Identifying automation opportunities within current analytics processes and working closely with the architecture team to implement solutions
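
To give candidates a sense of the day-to-day scripting, here is a minimal sketch of the first two responsibilities: loading a data file with Pandas and describing a table's structure via a REST call with Requests. The host, token header, endpoint path, and payload fields are illustrative assumptions, not the actual Alation API contract.

```python
# Minimal, illustrative sketch: load a data file with Pandas and push basic
# table structure to a catalog via a REST call with Requests.
# The base URL, token header, endpoint path, and payload fields are
# assumptions for illustration only, not the actual Alation API contract.
import pandas as pd
import requests

ALATION_BASE_URL = "https://alation.example.com"  # hypothetical host
API_TOKEN = "..."  # in practice, fetched from AWS Secrets Manager

def load_table(path: str) -> pd.DataFrame:
    """Read Excel, CSV, or JSON into a DataFrame based on the file extension."""
    if path.endswith((".xlsx", ".xls")):
        return pd.read_excel(path)
    if path.endswith(".json"):
        return pd.read_json(path)
    return pd.read_csv(path)

def describe_table(df: pd.DataFrame, schema: str, table: str) -> None:
    """Send column names and dtypes describing the table to the catalog."""
    payload = {
        "schema": schema,
        "table": table,
        "columns": [{"name": col, "dtype": str(dtype)} for col, dtype in df.dtypes.items()],
    }
    resp = requests.post(
        f"{ALATION_BASE_URL}/integration/tables/",  # hypothetical endpoint path
        headers={"Token": API_TOKEN},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    df = load_table("customers.csv")  # hypothetical input file
    describe_table(df, schema="crm", table="customers")
```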
 
Knowledge, Skills, And Competencies:
- 3+ years of professional experience in a DevOps role
- Good knowledge of the Amazon Web Services (AWS) ecosystem
- Good knowledge of Python 3 and experience with libraries such as Pandas and Requests
- Working knowledge of Data Quality tools such as Trillium, Ataccama, Information Builders iWay, or similar tools is required
- Experience with Terraform and AWS basics (EC2, Lambda, Secrets Manager)
- Experience developing and maintaining CI/CD pipelines
- SQL proficiency with at least one common database platform such as Teradata, Redshift, Netezza, Oracle, or DB2
- Experience in data mining, large-scale data acquisition, auditing, transformation, and cleaning of both structured and unstructured data
- Competency in data manipulation and analysis: accessing raw data in varied formats using different methods
- Experience managing the data lifecycle, with a specific focus on Data Quality practices and data issue management/root cause analysis
- Upper-intermediate level of English
 
It will be a plus:
- Bachelor’s Degree in Computer Science, Software Engineering, or Engineering
- AWS certifications 
- Experience with Jenkins and Groovy
- Experience with Tibco iWay Service Manager, Data Quality Server, or Data Migration software
- Basic Git experience
 
What do we offer our new colleague?
- An interesting full-time position on a new project
- Competitive compensation (based on market data and the candidate’s technical level)
- Flexible work schedule
- 3 health packages to choose from
- Annual paid vacation and paid state holidays
- Free English classes (online)
- An individual approach to professional development
- Lack of bureaucracy and micromanagement
- Modern, comfortable office facilities (a barbecue zone, kitchens, company-paid lunches, lounge rooms, coffee machines, etc.)
- Foreign business trips (after the war)
- On-site parking lot and charging station for electric cars
- Corporate gifts, celebrations, and fun activities
- Sports activities: ping-pong, soccer, workouts