Analyst/Developer Hadoop

  • Location

    Brussels, Belgium

  • Contact:

    Miroslav Marchev

  • Duration:

    6 months

Analyst/Developer Hadoop

Job location: Brussels, Belgium
Start Date: March-April 2019

Our client is the leading telecommunications company in Belgium and a market leader in a number of areas, including retail and wholesale fixed-line telephony services, mobile communications services, and broadband data and Internet services. "We drive gender-balanced leadership."

You are part of a squad that works in an agile framework (Scrum). All squad members sit together physically and work on a specific business domain (e.g. Network & Customer Experience, Location Insights, Value Analytics, …).
Your role
You will work closely with your squad colleagues and be responsible for the analysis, development, testing and follow-up of the implementation of project deliverables, with a focus on quality:
· loading and integrating data from our multitude of applications, service platforms and networks into the Enterprise Data Warehouse/Data Lake via ETL solutions
· creating data marts and/or reports/dashboards
The analyst/developer is a key contact person for internal clients and ensures optimal communication and collaboration with them and with all other teams involved, inside and outside our domain.
He/she also contributes to the organization and optimization of team processes.
You make sure that your (and your colleagues') solutions meet the security (retail/wholesale, legal, …) and confidentiality requirements defined by the Legal and Security teams and by our Data Warehouse Data Security Officer (DSO).
You also ensure that solutions make optimal use of the compute capacity (CPU, I/O, storage, memory, …) available on our platforms (Teradata, Hadoop, Oracle, …) and guarantee optimal performance.
Required profile and skills:
· Data Integration
  o Profound knowledge of the key concepts of Business Intelligence and Data Warehousing, and of entity-relationship (E-R), Third Normal Form (3NF) and dimensional modeling (knowledge of RDA helps)
  o Good knowledge of Hadoop and related technologies (Hortonworks Hadoop stack, HBase, MapReduce, Storm, Hive, Kafka, Scala, Spark, …); Scala and Spark are mandatory!
  o Knowledge of and experience with:
    § scripting (Perl, Java, Python, R, …)
    § NoSQL principles and systems: document stores, graph databases, KVP stores
    § massively parallel processing (MPP) and tuning
    § statistical modeling
    § modeling tools (RDA)
You can submit your CV via the website or send me your application.

Speak to you soon!