- Collecting customer information needs and mapping the information infrastructure, data sources, and the ecosystem of end users.
- Designing, implementing and maintaining the data pipelines needed for optimal extraction, transformation and loading (ETL) of data from a wide variety of sources.
- Performing different types of analyses on structured and unstructured data to answer specific business questions.
- Supporting the evolution of the architecture of data management platforms.
- Developing data transformation jobs, selecting the right tool for each job.
- Transferring knowledge to users.
- Completing training courses and certifications in technologies related to the role.
- Participating in and contributing content to the IT community.
Skills and requirements
- Solid knowledge of database configuration and use (SQL, SSAS, SSIS, PolyBase).
- Solid knowledge of database management, modeling, querying and ETL processing.
- Clear understanding of modern data warehouse architectures and concepts of structured and unstructured data types.
- Experience with cloud services (Azure, AWS, IBM Cloud, Google Cloud).
- Experience with relational (SQL Server, Postgres) and NoSQL databases.
- Experience with programming languages (Python, PowerShell, Bash, C#, .NET).
- Experience with workflow management tools and data pipelines (SSIS, Pentaho, Talend, Airflow).
- Basic knowledge of network configuration (DHCP, DNS, TCP/IP, VPN).
- Very good visual, oral and written communication skills.
- Excellent learning and adaptation skills.
- Intermediate level of English.
- University degree in Systems Engineering, Informatics, Computer Science or a related field.
- Basic Linux management.
- Experience with code versioning and repository management.
- Experience with agile management methodologies.
- Proficiency with Hadoop ecosystem tools (Flume, Sqoop, YARN, Hive, Spark).
- Experience as a BI Data Analyst.